https://gcc.gnu.org/bugzilla/show_bug.cgi?id=103881

--- Comment #3 from thomas at habets dot se ---
Interesting.

So the difference between "x |= a & a" and "x |= f() & f()" is that the latter
crosses a somewhat arbitrary complexity threshold beyond which GCC can no
longer prove the operation is safe, and therefore warns that it may lose
precision?

It's understandable, but unfortunate. It means I have no hope of keeping
real-world programs free of false positives from conversion warnings.
