int triangle(int a, int b) { int c; c = a*b/2; return c; }

emits this very bizarre code (at -O, -O2):

        mov     a, %edx
        mov     b, %eax
        imull   %edx, %eax
        movl    %eax, %edx
        shrl    $31, %edx
        addl    %edx, %eax
        sarl    %eax
        ret

Why are the two instructions after the imull emitted? Shouldn't this simply be an imull followed by a sarl? The emitted code extracts the most significant bit of a*b, adds it to a*b, and then shifts the result right. It almost looks as if it were trying to round, or something like that. There is probably something obvious I'm overlooking here. Analogous code is generated for ppc, x86_64, sparc and mips. Please explain.

I also tried -Os; there the code becomes a cltd (sign-extend 32 to 64 bits) followed by an idivl by 2. Could it be that the peephole optimizer converts the idivl to a shift but forgets to remove the sign-extension code?
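For reference, here is a minimal sketch (not part of the original report; the helper names and test values are mine) that mirrors the emitted instruction sequence in C, so the "sign bit + add + arithmetic shift" pattern can be compared directly against a plain one-bit shift and against a*b/2 for a few inputs. It assumes >> on a negative int behaves as an arithmetic shift, as it does on the targets listed above.

#include <stdio.h>

/* Mirrors the emitted sequence: add the sign bit of x to x,
   then shift arithmetically right by one (shrl $31 / addl / sarl). */
static int halve_like_emitted(int x)
{
    unsigned sign = (unsigned)x >> 31;  /* 1 if x is negative, else 0 */
    return (x + (int)sign) >> 1;
}

/* What a bare sarl would compute instead. */
static int halve_with_shift(int x)
{
    return x >> 1;
}

int main(void)
{
    int samples[] = { 7, 6, 0, -1, -6, -7 };
    for (unsigned i = 0; i < sizeof samples / sizeof samples[0]; i++) {
        int x = samples[i];
        printf("x=%3d  x/2=%3d  emitted=%3d  sarl-only=%3d\n",
               x, x / 2, halve_like_emitted(x), halve_with_shift(x));
    }
    return 0;
}

Running this shows where the three columns agree and where the bare shift diverges from x/2, which is the behavior the two extra instructions appear to be compensating for.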
--
           Summary: bizarre code for int*int/2
           Product: gcc
           Version: 3.4.4
            Status: UNCONFIRMED
          Severity: minor
          Priority: P2
         Component: c
        AssignedTo: unassigned at gcc dot gnu dot org
        ReportedBy: felix-gcc at fefe dot de
                CC: gcc-bugs at gcc dot gnu dot org
 GCC build triplet: i686-pc-linux-gnu
  GCC host triplet: i686-pc-linux-gnu
GCC target triplet: i686-pc-linux-gnu

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=22072