On Mon, Feb 28, 2005 at 09:24:20AM -0500, Robert Dewar wrote:
> Not quite, Marc is suggesting that -pedantic be the default if I read
> the above statement correctly.
Yep. Except it's probably too late for that, and there is stuff in -pedantic that is downright obnoxious (because every C compiler I know accepts it) and, err... really pedantic, as opposed to actual warnings that help you find out about obscure extensions. In my opinion, this is just a case of a very bad design decision taken years ago. It took me years to form a firm opinion about it, too.

The basic issue I'm talking about is failure modes for software. There are a few interesting error categories:

- stuff that is an error, but that the program can recover from;
- stuff that is not really an error, but that the program can't find out about.

The first class is interesting because it's a class we should never recover from in programming tools: doing so causes all sorts of grief later down the line. Why? Because it's an ERROR, so it's not actually specified formally, and recovering from it gracefully muddles the semantics: some errors will be recovered from and some will not, and this may change from release to release, allowing half-broken software to grow and develop.

For instance, the extern/static inline stuff in gcc falls into that category, in my book. GCC never enforced a hard check that inlined functions were actually linked in when building the final executable, so people like Torvalds complained when a later version of gcc no longer inlined a function and could not find it in a library. At the time the complaint came up, I sided with the GCC developers: the extra feature was being misused by the Linux developers... Now, I'm not so sure. I think a design step was missed, along the guidelines of not allowing erroneous software to build.

The second class is interesting because it comes up all the time with -Wall -Werror: all the `variable not initialized' stuff (that one is obvious), and, to a lesser extent, all the `comparison is always true due to limited range of data type' warnings.
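To make the second case concrete, here is a minimal sketch of the kind of portable code that trips the `comparison is always true' warning. The function name is made up for illustration:

```c
/* Hypothetical range check from portable code.  Whether plain char
 * is signed or unsigned is implementation-defined.  On targets where
 * char is signed, the c >= 0 test is load-bearing; on targets where
 * char is unsigned, it is vacuously true and gcc emits "comparison
 * is always true due to limited range of data type".  Deleting the
 * test to silence the warning would break the signed-char targets. */
int is_plain_ascii(char c)
{
    return c >= 0 && c <= 127;
}
```

The warning is correct for the one target being compiled, but the code is correct for all targets, which is exactly why cleaning it up is often a bad idea.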
Those warnings actually occur all the time in portable code, and are very hard to get rid of (and it's probably not a good idea to clean them all up). This makes -Wall -Werror much less useful than it could be.

Forgive me if I'm (partly) reinventing the wheel, but more and more it seems to me that a category of warnings is missing: the stuff that the compiler is sure about, and that cannot come from portability issues. Say, the -Wsurething warnings. If we could find reasonable semantics for these (along with a -Wunreasonable-extension switch), then maybe we would have something that -Werror could use.

As far as practical experience goes, I've spent enough time dealing with OpenBSD kernel compilation (which does -Wall -Werror, btw) and with cleaning up various old C sources (which invariably starts with a combination of warning switches, and then continues with reading pages of inappropriate warnings to find the right ones) to be fairly certain this kind of diagnostic could be useful...

Oh yes, and the change from the old preprocessor to the new and improved cpplib took quite a long time to recover from, too... You wouldn't believe how many people misuse token pasting all the time. But I put in the effort because I think that's a good change: it takes unambiguously wrong code out in the backyard and shoots it dead.
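For the record, a sketch of the token-pasting difference, with made-up macro names. The valid case pastes two identifier fragments into one identifier; the invalid case, shown only in a comment so this file still compiles, is the sort of thing the old preprocessor silently accepted and cpplib now rejects:

```c
/* Hypothetical concatenation macro. */
#define CAT(a, b) a ## b

/* Valid: "foo" ## "bar" forms the single identifier foobar. */
int CAT(foo, bar) = 42;          /* expands to: int foobar = 42; */

/* Invalid: CAT(x, +) would paste "x" and "+" into "x+", which is
 * not a single preprocessing token.  The old gcc preprocessor let
 * this through by mere textual juxtaposition; cpplib diagnoses it
 * ("pasting ... does not give a valid preprocessing token").  If
 * juxtaposition is all you want, the ## operator is simply wrong. */
```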