https://gcc.gnu.org/bugzilla/show_bug.cgi?id=98226

--- Comment #6 from Oleg Zaikin <zaikin.icc at gmail dot com> ---
(In reply to Jonathan Wakely from comment #2)
> Oh, but you didn't enable any optimization at all, so who cares about the
> performance?

Let me give the whole picture. The issue is very close to the one in
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=97759 where a C-style
implementation was replaced by C++20's std::has_single_bit. We have a complex
project that is, of course, compiled with all proper optimization flags. It
contains a function firstzero(unsigned x) that returns 2^i, where i is the
position of the lowest 0-bit of x, and returns 0 iff x has no 0-bit. Its
implementation is:
  unsigned firstzero(const unsigned x) noexcept {
#if __cplusplus > 201703L
    // C++20: countr_one yields the index of the lowest 0-bit;
    // the all-ones case is handled separately to avoid an overlong shift.
    return x == unsigned(-1) ? 0 : unsigned(1) << std::countr_one(x);
#else
    // C-style: x+1 flips the trailing 1s and the first 0; the XOR
    // isolates the changed bits, the AND keeps only the new 1-bit.
    const unsigned y = x+1; return (y ^ x) & y;
#endif
  }
When we switched from compiling with g++ in C++17 mode to C++20 mode, the
performance of the whole program decreased by about 7%. It turned out that the
main reason was the firstzero function.
