sammccall added a comment. Some notes before our meeting.
It does appear that it's possible to generate duplicate forest nodes in this way, and AFAIK any method other than explicitly deduplicating creation using a map<(rule, rhsnodes), sequencenode> is going to have this problem. The good news is:
- the cache lifetime is local to the family (outer batch): collisions can only happen within a family. (We can make the map a member and clear it between batches, to reuse its storage.)
- I think we can form the forest nodes at the bottom of the enumerateReducePaths() call, and use them + the base GSS node in place of the reduce path. (We never actually use the internal GSS path nodes, just the forest nodes + the GSS base.)
- Now a reduction found by enumerateReducePaths() is identified by (base, sequence node, new state), which is cheap to compare/group in various ways.
I feel like this should simplify subsequent steps, but I need to think about it more. A rough sketch of the dedup map follows.
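To make the dedup idea concrete, here is a minimal, self-contained sketch, not the actual clang-pseudo code: `ForestNode`, `RuleID`, `SequenceNodeCache`, and the ownership model are illustrative stand-ins. It shows a per-batch map from (rule, RHS nodes) to the created sequence node, with a `clear()` between families so the same object can be reused.

```cpp
#include <cstdint>
#include <map>
#include <memory>
#include <utility>
#include <vector>

// Hypothetical stand-ins for the parser's types; names are illustrative only.
using RuleID = std::uint16_t;
struct ForestNode {};

// Deduplicates sequence-node creation within one outer batch ("family"):
// two reductions of the same rule over the same RHS nodes yield the same node.
class SequenceNodeCache {
public:
  // Returns the node already created for (Rule, RHS) in this batch,
  // otherwise builds one and remembers it.
  const ForestNode *getOrCreate(RuleID Rule,
                                std::vector<const ForestNode *> RHS) {
    auto Key = std::make_pair(Rule, std::move(RHS));
    auto It = Cache.find(Key);
    if (It != Cache.end())
      return It->second;
    // Real code would build the sequence node in the forest's arena.
    Owned.push_back(std::make_unique<ForestNode>());
    return Cache.emplace(std::move(Key), Owned.back().get()).first->second;
  }

  // Called at the end of each outer batch; keys from one batch can never
  // collide with the next, so the map is emptied and its object reused.
  void clear() { Cache.clear(); }

private:
  std::map<std::pair<RuleID, std::vector<const ForestNode *>>,
           const ForestNode *>
      Cache;
  // Stand-in for forest ownership: the nodes outlive the batch even though
  // the dedup map does not.
  std::vector<std::unique_ptr<ForestNode>> Owned;
};
```

In the real parser the forest nodes would be arena-allocated and live for the whole parse; only the map itself is per-batch state, which is why making it a member and clearing it per family is enough.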