http://gcc.gnu.org/bugzilla/show_bug.cgi?id=49382
Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |aoliva at gcc dot gnu.org

--- Comment #1 from Jakub Jelinek <jakub at gcc dot gnu.org> 2011-06-13 19:27:51 UTC ---
I'm afraid that without statement frontiers there is nothing that can be done
about it, and it is questionable whether this is a bug at all.  The side
effect (which has been optimized away) just happens so early that there are
no real insns left before it.

  x_2 = x_1(D);
  # DEBUG x => x_1(D) + 1
  clobber (x_2);
  i = 1;
  return;

simply becomes:

  # DEBUG x => x_1(D) + 1
  clobber (x_1(D));
  i = 1;
  return;

Claiming the side effect happens after the call would not be correct: the
side effect really happens before the call, and the call is the first insn.
With statement frontiers you could step through the zero-length insn between
the start of the function (where x would live in DW_OP_reg5) and the location
before the first call (where x would already be DW_OP_breg5 <1>
DW_OP_stack_value).
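(For illustration only; the bug's actual test case is not quoted in this
comment.  The sketch below is one kind of source that could produce GIMPLE of
the shape shown above: the saved pre-increment value x_2 is copy-propagated
to x_1(D), so the increment of x survives only in the # DEBUG stmt, with no
real insn before the first call.  The names `clobber`, `foo`, and `sketch.c`
are made up for this example.)

  /* sketch.c - hypothetical reproducer; compile with: gcc -O2 -g sketch.c  */
  #include <stdio.h>

  int i;

  /* Stand-in callee; any call GCC cannot eliminate will do.  */
  void clobber (int v) { printf ("%d\n", v); }

  void
  foo (int x)
  {
    int old = x++;   /* side effect on x; at -O2 it is optimized away
                        into "# DEBUG x => x_1(D) + 1"  */
    clobber (old);   /* first (and only) real insn before i = 1  */
    i = 1;
  }

  int
  main (void)
  {
    foo (5);
    return 0;
  }

Stepping in gdb at -O2 -g, there is then no instruction at which x already
shows its incremented value but the call has not yet happened.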