rsmith added a comment.

In D59885#1501774 <https://reviews.llvm.org/D59885#1501774>, @ilya-biryukov wrote:

> The suggested approach looks promising. The difference seems to be within
> the noise level on my machine.


:) Awesome!

> I guess it would make more sense to do the following before landing this:
>
> - flip the flag, i.e. instead of reporting whether the token is "new" (the
>   common case), report whether it's "cached", i.e. coming from `CachedTokens`
>   or from a non-macro-expansion token stream.
> - set this flag only inside `CachingLex` and `TokenLexer`.
>
> The revision also removes `IsNewToken` from `CachingLex` and sets the flag
> in the token instead. Any other suggestions?
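For concreteness, here's a rough sketch of the flipped flag as I understand the
suggestion; the flag name (`IsReinjected`) and the exact placement are
illustrative placeholders, not necessarily what the revision does:

```cpp
// Illustrative only: a "replayed token" flag set at the two places that
// re-emit previously seen tokens; genuinely new tokens leave it clear.

// clang/include/clang/Lex/Token.h -- a new TokenFlags bit (value is a
// placeholder; it must not collide with the existing flags):
//   IsReinjected = 0x800, // came from CachedTokens or a replayed token stream

// clang/lib/Lex/PPCaching.cpp -- tokens served from the cache are replays:
void Preprocessor::CachingLex(Token &Result) {
  if (CachedLexPos < CachedTokens.size()) {
    Result = CachedTokens[CachedLexPos++];
    Result.setFlag(Token::IsReinjected);
    return;
  }
  // ... otherwise lex a genuinely new token (flag stays clear) and cache it ...
}

// clang/lib/Lex/TokenLexer.cpp -- in TokenLexer::Lex, when replaying a plain
// token stream rather than expanding a macro:
//   if (!Macro)
//     Tok.setFlag(Token::IsReinjected);
```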

Notionally, I think we should separate out "produces new tokens" from "is macro 
expansion" in `TokenLexer`. Please take a quick look at the various callers of 
`EnterTokenStream` (particularly the ones inside `Lex`; I'm not too worried 
about the ones in `Parse`) and see which (if any) of them actually intend to 
inject new "real" phase-4-of-translation tokens. If there are any (and maybe 
even if there aren't?), please add a flag on `EnterTokenStream` to specify 
whether the `TokenLexer` believes it's creating new tokens or replaying tokens 
that have been seen before.
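To sketch what I mean (the `IsReinject` parameter name is just a placeholder,
and I'm only showing the `ArrayRef` overload; the other overloads would grow
the same parameter):

```cpp
// Callers state whether the stream replays tokens the consumer has already
// seen, or injects genuinely new phase-4 tokens; TokenLexer would then set
// (or not set) the replay flag on each token it hands out accordingly.
void Preprocessor::EnterTokenStream(ArrayRef<Token> Toks,
                                    bool DisableMacroExpansion,
                                    bool IsReinject);

// A caller replaying tokens that were already lexed from the source:
PP.EnterTokenStream(Toks, /*DisableMacroExpansion=*/true,
                    /*IsReinject=*/true);

// A caller synthesizing tokens that never appeared in the source
// (e.g. a pragma handler building its own token sequence):
PP.EnterTokenStream(SynthesizedToks, /*DisableMacroExpansion=*/true,
                    /*IsReinject=*/false);
```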

We'll need to decide what to do about the annotation tokens that we create when 
we enter / leave / import modules (`Preprocessor::EnterAnnotationToken`). In 
some sense, those are "new" tokens -- and that's how the parser will interpret 
them -- but they didn't come from the original source file. I think either 
answer could be OK there, but I'm inclined to treat them as "new" because that 
gives more information to the token consumer.
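If we do treat them as "new", here's a sketch of how `EnterAnnotationToken`
could say so, again assuming the hypothetical `IsReinject` parameter from
above (the body otherwise mirrors the existing implementation):

```cpp
void Preprocessor::EnterAnnotationToken(SourceRange Range,
                                        tok::TokenKind Kind,
                                        void *AnnotationVal) {
  // Produce a one-token stream carrying the annotation token.
  auto Tok = std::make_unique<Token[]>(1);
  Tok[0].startToken();
  Tok[0].setKind(Kind);
  Tok[0].setAnnotationRange(Range);
  Tok[0].setAnnotationValue(AnnotationVal);
  EnterTokenStream(std::move(Tok), 1, /*DisableMacroExpansion=*/true,
                   /*IsReinject=*/false); // annotation counts as a new token
}
```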


Repository:
  rG LLVM Github Monorepo

CHANGES SINCE LAST ACTION
  https://reviews.llvm.org/D59885/new/

https://reviews.llvm.org/D59885


