Follow-up Comment #23, bug #63332 (group groff):

[comment #20:]
> Do either of these sound reasonable to you?  Do you have a preference?

At the moment, this works, and it'd be a shame if it stopped working (which,
if I understand correctly, both of your proposals would cause).

$ cat sad
.char \[extra_sad] **\[sad]**
.char \[sad] :-(
I am \[extra_sad].
$ groff -Tascii sad | cat -s
I am **:-(**.

$ fgrep -v \( sad | groff -Tascii | cat -s
troff:<standard input>:2: warning: special character 'sad' not defined
I am ****.


Characters are evaluated at time of use, not time of definition, and groff has
worked this way for a long time.  (The .char requests are groff innovations,
so there's no further-back history to worry about.)  The documentation
strongly implies this is by design: "Every time C is to be output, CONTENTS is
processed in a temporary environment and the result encapsulated in a node."
I foresee a lot of breakage if .char validity is verified at time of
definition.  (For not a lot of gain, as character \[a] could be defined in
terms of a character \[b] that exists at the time of definition but not at a
later time of use.)

Is changing that an essential aspect of fixing the core bug identified in
comment #1?  In theory, it seems like it shouldn't be.  In fact, it seems
like the core bug is that groff is validating the RHS _too_ closely rather
than not closely enough.  But I know sometimes theory collides with the
groff parser in unexpected ways.

