https://gcc.gnu.org/bugzilla/show_bug.cgi?id=123180

--- Comment #2 from Jonathan Wakely <redi at gcc dot gnu.org> ---
We get away with it because __buf.begin() will only be null in extreme
circumstances, where even allocating storage for a single value_type object
fails and returns null. If that happened, we would take the final else branch
and try to write through the null pointer, which is undefined behaviour.

But what usually happens is that we either allocate the requested storage and
take the first branch (which is the optimized path), or we get some storage but
less than requested and take the second branch. That second branch is the
slowest path, because it's intended for the case where we have *no* additional
storage, so any storage we did manage to get simply goes unused.

Definitely still a bug, but *in most cases* it only hurts performance and won't
have undefined behaviour.
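
To make the three cases concrete, here is a rough stand-alone sketch of that
dispatch. The names and conditions are made up purely to reproduce the
behaviour described above; this is not the real libstdc++ code:

#include <cstddef>
#include <cstdio>

// Stand-in for the temporary buffer: what was requested vs. what we got.
struct TmpBuf { int* begin; std::size_t size, requested; };

// Hypothetical conditions, chosen only to match the behaviour described
// above, not copied from the libstdc++ sources.
void sort_dispatch(const TmpBuf& buf)
{
  if (buf.size == buf.requested)
    std::puts("first branch: full buffer, optimized path");
  else if (buf.size != 0)
    std::puts("second branch: in-place sort, intended for *no* extra storage");
  else
    std::puts("final else branch: writes through buf.begin, UB when it is null");
}

int main()
{
  int storage[10];
  sort_dispatch({storage, 10, 10}); // got everything we asked for
  sort_dispatch({storage, 2, 10});  // got something, but less than requested
  sort_dispatch({nullptr, 0, 10});  // got nothing at all
}

The program below demonstrates all three cases by replacing the nothrow
operator new so that allocations larger than a chosen limit fail: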

#include <algorithm>
#include <ranges>
#include <new>
#include <cstdlib>
#include <cstdio>

std::size_t limit = std::size_t(-1); // effectively unlimited to start with

// Replacement nothrow operator new: refuse any allocation larger than limit.
void* operator new(std::size_t n, const std::nothrow_t&) noexcept
{
  if (n > limit)
  {
    std::printf("nope %zu\n", n);
    return nullptr;
  }
  return std::malloc(n);
}

// A replacement throwing operator new must not return null, so throw on failure.
void* operator new(std::size_t n)
{
  if (void* p = std::malloc(n))
    return p;
  throw std::bad_alloc();
}

void operator delete(void* p) noexcept { std::free(p); }
void operator delete(void* p, std::size_t) noexcept { std::free(p); }

int main()
{
  int a[]{ 4, 6, 2, 1, 7, 9, 2, 2, 2, 8 };
  std::puts("sorting with no limit on memory:");
  std::ranges::stable_sort(a); // gets as much memory as it needs
  limit = 2 * sizeof(int);
  std::puts("sorting with small limit on memory:");
  std::ranges::stable_sort(a); // only gets memory for two ints
  limit = 0;
  std::puts("sorting with zero limit on memory:");
  std::ranges::stable_sort(a); // gets no memory and crashes
}

On my machine this prints the following (each "nope" line is a failed
allocation attempt; the buffer code retries with smaller sizes):

sorting with no limit on memory:
sorting with small limit on memory:
nope 20
nope 12
sorting with zero limit on memory:
nope 20
nope 12
nope 8
nope 4
Segmentation fault (core dumped)
