Hi Larry,
> Groff was barfing with a "too many open files" error, which I worked around
> by doing "ulimit -n 768" first. I ended up with over 640 HTML files. I'm not
> sure why every output file has to be open at once, though.
Sounds like either a bug or a design artifact that'll need fixing.
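For anyone else hitting the same limit, the workaround looks like this; the
groff invocation at the end is illustrative (left commented out) and the
input name is made up:

```shell
#!/bin/sh
# Show the current soft limit on open file descriptors.
ulimit -n
# Raise it for this shell and its children; this must stay within the
# hard limit (see `ulimit -Hn').
ulimit -n 768
# Then run the conversion, e.g. (illustrative input file name):
#   groff -Thtml -mandoc some.1 > some.html
```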
Ralph Corderoy wrote:
> Hi Larry,
>
>> Groff was barfing with a "too many open files" error, which I worked around
>> by doing "ulimit -n 768" first. I ended up with over 640 HTML files. I'm not
>> sure why every output file has to be open at once, though.
>
> What status close() returns depends on your awk.
Hi Walter,
> Ralph Corderoy wrote:
> > What status close() returns depends on your awk. And closing a pipe
> > should wait for the command to finish so if you've a grep that
> > produces 1E6 lines and you just want to read the first then pipe it
> > into `sed 1q' before reading it into awk so the grep gets a SIGPIPE on
> > writing to the finished sed, curtailing its largess.
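The curtailing effect is easy to see with a stand-in producer; here `seq'
plays the part of the grep with 1E6 lines of output (illustrative):

```shell
#!/bin/sh
# sed prints the first line and exits; on its next write the producer
# receives SIGPIPE and stops early, instead of generating all 1E6 lines.
first=$(seq 1 1000000 | sed 1q)
echo "$first"    # prints 1
```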
Walter Harms wrote, quoting Ralph Corderoy:
>> ... if you've a grep that produces
>> 1E6 lines and you just want to read the first then pipe it into
>> `sed 1q' before reading it into awk so the grep gets a SIGPIPE on
>> writing to the finished sed, curtailing its largess.
>
> To access head or tail
Hi again Larry,
> What status close() returns depends on your awk. And closing a pipe
> should wait for the command to finish so if you've a grep that
> produces 1E6 lines and you just want to read the first then pipe it
> into `sed 1q' before reading it into awk so the grep gets a SIGPIPE on
> writing to the finished sed, curtailing its largess.
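Putting the two points together in awk itself, a minimal sketch; printf
stands in for the grep (illustrative data), and what close() returns for a
pipe still depends on the awk in use:

```shell
#!/bin/sh
# Pipe the producer through `sed 1q' inside the awk command string, read
# the single surviving line with getline, then close() the pipe.
awk 'BEGIN {
    cmd = "printf \"hit1\\nhit2\\nhit3\\n\" | sed 1q"
    cmd | getline line          # reads only the first line
    status = close(cmd)         # return value is implementation-dependent
    print "line=" line, "close=" status
}'
```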