https://gcc.gnu.org/bugzilla/show_bug.cgi?id=91092

--- Comment #16 from Florian Weimer <fw at gcc dot gnu.org> ---
(In reply to Vincent Lefèvre from comment #15)
> (In reply to Florian Weimer from comment #14)
> > (In reply to Vincent Lefèvre from comment #13)
> > > By "implicit function declarations", does this include K&R style
> > > declarations?
> > 
> > No, there is nothing implicit about them.
> 
> OK, but the issue is similar: in both cases, the parameters/arguments are
> not checked, yielding undefined behavior, so that they fall in the same
> class.

I wouldn't say that.  If the function has a return type of int and you call it
with argument types that match the parameters after the default argument
promotions, the behavior is well-defined.  It's just very easy to create
portability hazards this way.

> > > I found out a few days ago that GMP still uses K&R-style declarations,
> > > and that's in a configure script. The issue is that there is a potential
> > > type mismatch between the caller (long) and the callee (unsigned int), and
> > > GCC fails to generate "correct" code in such a case.
> > 
> > GNU CC has supported an extension for many, many years where a K&R function
> > *definition* with a prior function prototype in scope behaves exactly as a
> > prototype-style function definition.  (On some targets, the two have
> > substantially different ABIs, beyond how parameters are handled.)
> 
> Actually I meant K&R function definition (with no previous prototype).

If you call such functions (old-style function definitions without a prototype
in scope) via a prototype declaration, this will result in stack corruption on
some targets (notably little-endian POWER).  The stack corruption is often
subtle and hard to spot.
