> While there are programmers out there where any attempt at getting them
> to write even remotely secure software is futile, most of them can be
> taught the concept of LIA-1 semantics, and a lot of them already know
> and write software accordingly. Getting them to understand that an
> expression like "if (a + b < a)" doesn't do what they would expect is
> harder. Getting them to understand that "if (a < 0)" might fail on them
> in obscure cases is close to impossible.
Really? It's easier to teach programmers that if they keep adding positive numbers, they'll eventually get some negative number than it is to teach them that if they keep adding positive numbers, it's undefined what happens when it overflows? This seems odd to me since so few programmers know machine-level operations anymore.
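To make the difference concrete, here is a small C sketch (the function names are mine, purely for illustration). The first test assumes "a + b" quietly wraps to a negative value; because signed overflow is undefined, a compiler is entitled to assume the addition never overflows and fold the test to "b < 0". Rearranging the comparison so the overflowing addition is never evaluated keeps the check well defined:

  #include <limits.h>
  #include <stdio.h>

  /* The check the quoted text mentions.  It only "works" if signed
     overflow wraps; since overflow is undefined in C, a compiler may
     assume a + b cannot overflow and reduce this to (b < 0), i.e.
     optimize the test away for positive b. */
  static int overflows_wraparound_check(int a, int b)
  {
      return a + b < a;
  }

  /* A check that never performs the overflowing addition, so it is
     well defined for every pair of ints. */
  static int overflows_range_check(int a, int b)
  {
      if (b > 0)
          return a > INT_MAX - b;
      return a < INT_MIN - b;
  }

  int main(void)
  {
      int a = INT_MAX, b = 1;
      /* The first call actually performs an overflowing addition, so its
         result depends on compiler and flags; the second always reports 1. */
      printf("wraparound check: %d\n", overflows_wraparound_check(a, b));
      printf("range check:      %d\n", overflows_range_check(a, b));
      return 0;
  }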