https://gcc.gnu.org/bugzilla/show_bug.cgi?id=101806
Andrew Pinski changed:
           What    |Removed                       |Added
           Assignee|unassigned at gcc dot gnu.org |pinskia at gcc dot gnu.org
Andrew Pinski changed:
           What    |Removed                       |Added
           CC      |                              |acoplan at gcc dot gnu.org
--- Comment #3 from Andrew Pinski ---
Even a simple:

unsigned char g(unsigned char a, unsigned char b)
{
    return ((~a) & b) & 1;
}

produces the extra zero extend. But it is ok with:

unsigned char g(unsigned char *a, unsigned char *b)
{
    return
--- Comment #2 from Andrew Pinski ---
I think this will be fixed/improved by
https://gcc.gnu.org/pipermail/gcc-patches/2022-September/602089.html.
--- Comment #1 from Andrew Pinski ---
It happens to work on x86-64 (with -march=skylake-avx512) because we get a
zero_extend instead of an and there. I still don't understand how x86 is able
to figure out the &1 part.
Trying 11, 9 -> 12:
11: r