https://gcc.gnu.org/bugzilla/show_bug.cgi?id=103502
--- Comment #3 from Andrew Pinski <pinskia at gcc dot gnu.org> ---
(In reply to Stas Sergeev from comment #2)
> (In reply to Andrew Pinski from comment #1)
> > I think you misunderstood what precise means in this context really.
> > "Higher levels correspond to higher accuracy (fewer false positives). "
>
> So was it a false-positive?

There are still false negatives. You missed the first part:
"Takes care of the common pun+dereference pattern in the front end:
*(int*)&some_float. "

Because GCC can optimize that pun+dereference pattern without breaking the
code, GCC decided it should not warn with =3.
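
For illustration, a minimal sketch of that pun+dereference pattern (the
function name and the exact per-level behavior are my own example, not
taken from this report): compiling with -Wstrict-aliasing=1 or =2 may warn
on the cast below, while the default =3 is expected to stay quiet because
the front end handles this exact pattern without miscompiling it.

/* The *(int*)&some_float idiom discussed above: read a float's bit
   pattern through an incompatible pointer cast. */
int float_bits(float some_float)
{
    return *(int *)&some_float;
}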