http://gcc.gnu.org/bugzilla/show_bug.cgi?id=49885

--- Comment #6 from Daniel Kraft <domob at gcc dot gnu.org> 2011-07-29 17:00:12 UTC ---
For the record, with the original test case and -fdump-tree-original, I get:

s (integer(kind=4) & restrict nb)
{
  character(kind=1) bad_rec[1][1:.bad_rec];
  integer(kind=4) .bad_rec;
  bitsizetype D.1567;
  sizetype D.1568;
  bitsizetype D.1569;
  sizetype D.1570;

  {
    integer(kind=4) D.1566;
    integer(kind=4) M.2;

    M.2 = 80;
    D.1566 = *nb;
    if (D.1566 > M.2)
      {
        M.2 = D.1566;
      }
    .bad_rec = MAX_EXPR <M.2, 0>;
    D.1567 = (bitsizetype) (sizetype) NON_LVALUE_EXPR <.bad_rec> * 8;
    D.1568 = (sizetype) NON_LVALUE_EXPR <.bad_rec>;
    D.1569 = NON_LVALUE_EXPR <SAVE_EXPR <D.1567>> + 7 & -8;
    D.1570 = NON_LVALUE_EXPR <SAVE_EXPR <D.1568>>;
        character(kind=1) bad_rec[1][1:.bad_rec];
  }
...some code that does the write...

.bad_rec has the correct value (80) later on, in the print statement.  However,
the code above (the last line before the closing }) looks to me as if bad_rec
were declared just inside the shown block -- so its stack storage is no longer
available after the block ends, which would crash the program.
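In source-level terms, the situation is roughly as if the array had been
declared in an inner scope.  A sketch of the analogy (illustration only, not
actual frontend output; the subroutine and variable names are made up):

subroutine analogy(nb)
  implicit none
  integer, intent(in) :: nb
  block
    ! An automatic object declared inside a BLOCK lives only until END BLOCK.
    character(len=max(80, nb)) :: rec(1)
    rec(1) = 'ok'          ! fine here: the storage is still live
  end block
  ! rec's stack storage is gone at this point; code that still referenced it
  ! would behave like the miscompiled write in this PR.
end subroutine analogy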

So it seems that evaluating the max(...) creates the block shown above, and the
allocation of the automatic array is accidentally placed inside that block;
that would plausibly explain this bug.
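For reference, a reproducer consistent with the dump would look roughly like
this; this is my reconstruction from the dump above, not necessarily the exact
test case attached to the PR:

subroutine s(nb)
  implicit none
  integer, intent(in) :: nb
  ! Automatic array whose character length is max(80, nb); this matches
  ! the MAX_EXPR computation of .bad_rec in the dump above.
  character(len=max(80, nb)) :: bad_rec(1)

  write (bad_rec(1), '(A)') 'test'   ! the write mentioned above
  print *, len(bad_rec)              ! .bad_rec still holds 80 here
end subroutine s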

Now I'll try to find out how to fix it.
