On Mon, Aug 29, 2022 at 11:35:44PM +0200, Jakub Jelinek wrote:
> I guess I should try what happens with 0x110000 and 0x7fffffff in
> identifiers and string literals.
It is rejected in identifiers, but happily accepted in string literals:

const char32_t *a = U"����";
const char32_t *b = U"������";
int a����b = 1;
int c������d = 2;

	Jakub
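
For reference, a rough sketch of the same string-literal contents spelled with hex escapes instead of raw source bytes (illustrative only; the variable names below are made up). It assumes the classic extended, pre-RFC 3629 UTF-8 encoding, under which 0x110000 is the byte sequence F4 90 80 80 and 0x7fffffff is FD BF BF BF BF BF, presumably the raw bytes that render as replacement characters above:

/* Narrow literals carrying the same octets; hex escapes are not decoded
   as UTF-8, so these merely document which bytes the U"" literals above
   were fed.  */
const char *a_bytes = "\xf4\x90\x80\x80";          /* extended UTF-8 for 0x110000 */
const char *b_bytes = "\xfd\xbf\xbf\xbf\xbf\xbf";  /* extended UTF-8 for 0x7fffffff */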