On 5/10/23, G. Branden Robinson <g.branden.robin...@gmail.com> wrote:
> But it was interesting to me to observe the performance of GNU troff
> and grotty with literally billions of lines of input.
I agree, though it wasn't a completely realistic test, with each line
having only one word on it: no filling, no adjusting (though those are
probably cheap operations without a whole-paragraph algorithm). I
hesitate to say that out loud for fear that Alex will feel compelled to
generate an enormous input file of textual nonsense ("ChatGPT, give me
a billion-word essay about everything on the internet") and overheat
his CPU this time.

>> So an INT_MAX-length terminal page would be INT_MAX / \n[.V] lines
>> long. (Though probably not even that, because a defensively coded
>> "infinite" page length would be something like \n[INT_MAX]-2v.)
>
> Yes, to allow room for one blank line and the page footer.

The footer would be included inside the page boundary, though, right?
I was thinking more of guarding against round-off error, some other
macro increasing the page length by a line (though in the context of a
single macro package like -man, you can verify whether this ever
happens), or other noise. I use \n[INT_MAX]u-1v in my own nroff
settings because I had a problem with INT_MAX itself, though I failed
to record what that problem was and no longer remember. (A sketch of
such a setting appears at the end of this message.)

> As for why these values are what they are, I have only a
> semi-educated guess.

Your guesses sound quite plausible!

> Further inter-word and inter-sentence space is quantified in twelfths
> of an em as well (the `ss` request).

Nonintuitively, .ss's units aren't fractions of a standard
typographical measurement, but 1/12 of the current font's ordinary
space. (Actually, I guess this makes some sense: since the point of
the request is to adjust the width of the space, you'd want your
baseline to be whatever space width the typeface designer chose for
that typeface.) So for a 1/4-em space, as the Times family uses, .ss's
units are 1/48 of an em; a worked example follows at the end of this
message.

> Oh, and incidentally, I did a Fermi estimate of how many Encyclopedia
> Britannicas would be the equivalent of a 2 billion-line man page, and
> the answer is: about ten. (An EB has about 40 million words.) I'm not
> worried about a man page approaching that limit.

But woe unto any crackpot who decides to concatenate all the perldoc
documents into one file and sed some man markup into it.
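
P.S. To make the "infinite" page length setting concrete, here's a
minimal sketch of the sort of thing I mean. It's illustrative, not a
copy of anyone's actual configuration: as far as I know groff doesn't
predefine an INT_MAX register, so I set one by hand, and the value
assumes the formatter's measurements are 32-bit ints.

.\" Sketch only: groff has no predefined INT_MAX register, so define
.\" one ourselves (2147483647 assumes 32-bit ints).
.nr INT_MAX 2147483647
.\" On terminal devices, make the page effectively infinite, backing
.\" off one vee from the maximum as slack against round-off error or
.\" a macro nudging the page length upward.
.if n .pl \n[INT_MAX]u-1v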
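
And here's the promised worked example of the .ss arithmetic, assuming
(as above) a font whose ordinary space is 1/4 em, like Times. The
reading of the second argument as *additional* inter-sentence space is
mine, from the groff manual, so double-check it before leaning on it.

.\" With a 1/4-em space, each .ss unit is (1/12)(1/4 em) = 1/48 em.
.ss 12    \" the default: word space = 12/48 em = 1/4 em
.ss 24    \" word space = 24/48 em = 1/2 em
.\" The optional second argument sets the sentence space, which gets
.\" added on top of the word space after end-of-sentence punctuation:
.ss 12 24 \" 1/4-em word space, plus an extra 1/2 em between sentences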