On Tue, 17 Dec 2013 09:54:41 -0500, Roy Smith wrote:

> In article <mailman.4286.1387291924.18130.python-l...@python.org>,
> Neil Cerutti <ne...@norwich.edu> wrote:
>
>> On 2013-12-17, Steven D'Aprano
>> <steve+comp.lang.pyt...@pearwood.info> wrote:
>> > I would really like to see good quality statistics about bugs per
>> > program written in different languages. I expect that, for all we
>> > like to make fun of COBOL, it probably has fewer bugs per
>> > unit-of-useful-work-done than the equivalent written in C.
>
> Well, there was that little Y2K thing...
Oh come on, how were people in the 1990s supposed to predict that they would be followed by the year 2000???

That's a good point, but that wasn't a language issue, it was a program design issue. Back in the 70s and 80s, when saving two digits per date field seemed to be a sensible thing to do, people simply didn't imagine that their programs would still be used in the year 1999[1]. That's not the same sort of bug as (say) C buffer overflows, or SQL code injection attacks. It's not like the COBOL language defined dates as having only two digits.

[1] What gets me is that even in the year 1999, there were still programmers writing code that assumed two-digit years. I have it on good authority from somebody working as an external consultant for a bank in 1999 that he spent most of 1998 and 1999 fixing *brand new code* written by the bank's own staff. You'd think that having lived through that experience would have shaken his belief that private enterprise does everything better, and the bigger the corporation the better they do it, but apparently not. Go figure.

--
Steven
--
https://mail.python.org/mailman/listinfo/python-list
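A minimal Python sketch of the failure mode described above (the YY-MM-DD strings are hypothetical, not taken from any real system): once years are squeezed into two digits, string order and chronological order part ways at the century rollover, and any parser is forced to guess the century with a pivot.

from datetime import datetime

# Hypothetical records stored with two-digit years (YY-MM-DD).
records = ["98-07-01", "99-12-31", "00-01-01"]

# Lexicographic sorting puts "00" (meaning 2000) before "98" (1998).
print(sorted(records))   # ['00-01-01', '98-07-01', '99-12-31']

# Parsing with %y forces a century guess; Python's strptime follows the
# POSIX pivot: 69-99 -> 1969-1999, 00-68 -> 2000-2068.
print(datetime.strptime("69-12-31", "%y-%m-%d").year)   # 1969
print(datetime.strptime("68-12-31", "%y-%m-%d").year)   # 2068

The pivot only relocates the ambiguity to a different pair of years; storing four-digit years avoids the guess entirely.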