On Mon, Jan 13, 2025, 4:46 AM Greg Wooledge <g...@wooledge.org> wrote:
> On Mon, Jan 13, 2025 at 08:42:39 +0700, Robert Elz wrote:
> > It is unexpected, because users believe (from other experience)
> > that the delimiters separate fields, but in sh they don't, they
> > terminate fields.
>
> At the risk of going a bit off topic, may I ask *why* the shell does
> that?  Were there any files or data stream conventions in common use
> in the 1970s that would have prompted that design decision?
>
> Files like /etc/group and /etc/passwd certainly don't conform to
> that data format, so the sh behavior definitely didn't arise from either
> of those.
>
> The only realistic example I can think of would be something like
> printf '%s\0' "$@" to serialize the positional parameters into a stream
> of NUL-terminated C strings.

I learned that \0-delimited data (or some self-devised custom delimiter)
is the only *general* way to separate records on stdin/stdout, though I
can't speak to all the history discussed here.

> But the Bourne shell doesn't have the means
> of reading such a stream (it has no read -d '', and you certainly can't
> use NUL as an IFS character), so I don't think that was what Mr. Bourne
> had in mind either.  (Not to mention, the Bourne shell didn't have printf
> either.)
>
> It just feels like the shell is trying to do a job that nobody actually
> wants done.
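For anyone who wants to see the two points side by side, here is a quick
sketch in bash (not the historical Bourne shell, so the read -d '' part
is strictly a bash-ism); it just illustrates what was said above:

    # A trailing delimiter terminates the last field; it does not
    # start a new, empty one.
    str='one:two:'
    IFS=:
    set -- $str
    echo "$#"        # prints 2, not 3

    # bash (unlike the Bourne shell) can consume a NUL-delimited
    # stream with read -d '':
    set -- 'foo bar' baz
    printf '%s\0' "$@" |
    while IFS= read -r -d '' item; do
        printf '<%s>\n' "$item"
    done             # prints <foo bar>, then <baz>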