https://gcc.gnu.org/bugzilla/show_bug.cgi?id=79981
--- Comment #8 from Marc Glisse <glisse at gcc dot gnu.org> ---
(In reply to Richard Biener from comment #2)
> (simplify
>  (convert @1)
>  (if (INTEGRAL_TYPE_P (TREE_TYPE (@1))
>       && INTEGRAL_TYPE_P (type)
>       && (TREE_CODE (type) == BOOLEAN_TYPE
>           || TYPE_PRECISION (type) == 1))
>   (ne @1 { build_zero_cst (TREE_TYPE (@1)); })))

I thought casting int to _Bool, internally in gcc, had the same semantics as any other integer cast, i.e. keep only the low bit, which is not the same thing as != 0 (unless get_range_info says so). Am I misremembering?