https://gcc.gnu.org/bugzilla/show_bug.cgi?id=117470
            Bug ID: 117470
           Summary: new expression invalid size handling
           Product: gcc
           Version: 15.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: c++
          Assignee: unassigned at gcc dot gnu.org
          Reporter: jakub at gcc dot gnu.org
  Target Milestone: ---

https://eel.is/c++draft/expr.new#8.6 says that for invalid sizes
::operator new{,[]} shouldn't be called (at least when not constant
evaluating); instead the new-expression should either throw or result in a
null pointer.

Now, for -fexceptions and new int[s] with a VARYING size_t s we call
__cxa_throw_bad_array_new_length, which supposedly throws the right exception
(haven't tested) if there is overflow or the size is too large; that looks
good.  But for new (std::nothrow) int[s] my reading is that even with
-fexceptions it shouldn't throw but instead result in nullptr, yet we still
call __cxa_throw_bad_array_new_length.

For -fno-exceptions, we always call ::operator new{,[]}, but for the
overflowing/too large sizes we pass a ~size_t(0) argument, which supposedly
should throw or return NULL; is that correct?  I mean, the standard says
::operator new{,[]} shouldn't be called at all...

Also, from the POV of the allocation DCE, calling ::operator new* when the
standard says it shouldn't be called looks problematic, because then the
optimization can happily remove the ::operator new*/::operator delete* pair.
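
A reduced testcase sketching the nothrow case above (the function name and
the exact size value are just illustrative, not taken from the testsuite):

#include <new>
#include <cstddef>

int *
alloc_nothrow (std::size_t s)
{
  // If s is erroneous (s * sizeof (int) overflows or exceeds the
  // implementation limit), [expr.new]/8.6 says no allocation function is
  // called and, because std::nothrow is used, the result should be a null
  // pointer; currently g++ emits a call to __cxa_throw_bad_array_new_length
  // here instead.
  return new (std::nothrow) int[s];
}

int
main ()
{
  // ~size_t(0) elements can never fit; per the standard this should yield
  // nullptr rather than terminate with an uncaught bad_array_new_length.
  int *p = alloc_nothrow (~std::size_t (0));
  int r = p == nullptr ? 0 : 1;
  delete[] p;
  return r;
}

Compiling the same testcase with -fno-exceptions should show the
::operator new[] call with the ~size_t(0) argument described above.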