https://gcc.gnu.org/bugzilla/show_bug.cgi?id=94247
--- Comment #8 from Andrew Pinski ---
(In reply to Martin Sebor from comment #7)
> (In reply to Jakub Jelinek from comment #6)
> > No, it diagnoses the main bug
>
> Nope, it does not. -Wchar-subscripts is designed and documented to diagnose
> a common cause of a bug.
--- Comment #7 from Martin Sebor ---
(In reply to Jakub Jelinek from comment #6)
> No, it diagnoses the main bug
Nope, it does not. -Wchar-subscripts is designed and documented to diagnose a
common cause of a bug. The actual bug itself (which,
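To make the distinction concrete, here is a minimal sketch (the table and function names are mine, not from the PR) of the common bug that -Wchar-subscripts is designed to catch: indexing a 256-entry table with a plain `char`, which produces a negative subscript for high bytes on targets where `char` is signed.

```c
#include <assert.h>
#include <limits.h>

/* Hypothetical 256-entry lookup table indexed by byte value. */
static int table[UCHAR_MAX + 1];

/* Buggy: on targets where plain char is signed (e.g. x86), a byte such
 * as 0xE9 converts to a negative int, so table[c] reads before the
 * start of the array.  This is the subscript -Wchar-subscripts warns
 * about. */
int lookup_buggy(char c) { return table[c]; }

/* Fixed: converting through unsigned char first guarantees a subscript
 * in [0, UCHAR_MAX]. */
int lookup_fixed(char c) { return table[(unsigned char)c]; }
```

The usual fix, as the warning's documentation suggests, is exactly the cast through `unsigned char` shown in `lookup_fixed`.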
Jakub Jelinek changed:
What          |Removed |Added
CC            |        |jakub at gcc dot gnu.org

--- Comment #6 from Jakub Jelinek ---
No, it diagnoses the main bug
Martin Sebor changed:
What          |Removed |Added
CC            |        |msebor at gcc dot gnu.org
--- Comment #5
--- Comment #3 from Richard Biener ---
Yes, it's bad programming practice to use 'char' for any arithmetic.
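A small illustration of why `char` arithmetic is treacherous (the helper names below are hypothetical, not from the bug): whether plain `char` is signed is implementation-defined, so comparisons and index computations on it behave differently across targets.

```c
/* Hypothetical helpers: test whether a byte is in the high half
 * (0x80..0xFF) of the 8-bit range, e.g. a non-ASCII Latin-1 byte. */
#include <assert.h>

/* Buggy: where plain char is signed, a high byte promotes to a
 * negative int, so c >= 0x80 is never true; where char is unsigned
 * (e.g. ARM Linux), the same code works.  The result silently depends
 * on the target ABI. */
int is_high_byte_buggy(char c) { return c >= 0x80; }

/* Portable: convert to unsigned char before doing arithmetic or
 * comparisons on byte values. */
int is_high_byte_fixed(char c) { return (unsigned char)c >= 0x80; }
```

GCC even has a separate flag, -Wtype-limits, that can flag the always-false comparison in the buggy variant on signed-char targets.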
Richard Biener changed:
What          |Removed     |Added
Resolution    |---         |INVALID
Status        |UNCONFIRMED |RESOLVED
--- Comment #2 from Roland Illig ---
(In reply to Andrew Pinski from comment #1)
> > and the compiler already knows this
> Not when the warning is generated from the front-end. It does not know the
> range of the char variable there.
Ah, that fi
Andrew Pinski changed:
What          |Removed |Added
Keywords      |        |diagnostic

--- Comment #1 from Andrew Pinski ---
> and the compiler already knows this
Not when the warning is generated from the front-end. It does not know the
range of the char variable there.