dcoughlin added a comment.

It still seems like we are inferring invariants that are not sound. Do we need 
to restrict the symbolic transformation so that it applies only when `A - B` is 
also known not to overflow? Would that be sufficient? And does the analyzer 
often know these restrictions in practice?

For example, in the following we infer that `B - A` is in `[127, 127]` at 
program point (1), but that invariant does not hold when `A` is `-100` and `B` 
is `40` (there `B - A` is `140`):

  void bar(signed char A, signed char B) {
    if (A + 127 < B + 1) {
      // (1)
    }
  }
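
For concreteness, here is a small standalone driver (mine, not part of the 
patch) that plugs in those values; the guard holds after the usual integer 
promotions (27 < 41), yet `B - A` evaluates to `140`, outside the inferred 
`[127, 127]`:

  #include <stdio.h>

  int main(void) {
    // Counterexample values from above: A = -100, B = 40.
    signed char A = -100, B = 40;
    if (A + 127 < B + 1)               // 27 < 41 after promotion to int, so the branch is taken
      printf("B - A = %d\n", B - A);   // prints 140
    return 0;
  }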

Similarly, we infer that `A - B` is in `[0, 127]` at program point (2), but 
that invariant does not hold when `A` is `100` and `B` is `-40` (there `A - B` 
is `140`):

  void foo(signed char A, signed char B) {
    if (A + 0 >= B + 0) {
      // (2)
    }
  }
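
Again, a quick standalone check (not part of the patch) with those values; the 
guard holds (100 >= -40 after promotion), yet `A - B` evaluates to `140`, 
outside the inferred `[0, 127]`:

  #include <stdio.h>

  int main(void) {
    // Counterexample values from above: A = 100, B = -40.
    signed char A = 100, B = -40;
    if (A + 0 >= B + 0)                // 100 >= -40 after promotion to int, so the branch is taken
      printf("A - B = %d\n", A - B);   // prints 140
    return 0;
  }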

You can see these invariants by running with the 'debug.ViewExplodedGraph' 
checker enabled.


https://reviews.llvm.org/D35109


