"Andrew Pinski \(QUIC\) via Gcc" <gcc@gcc.gnu.org> writes:

> Deprecating complex integer types in GCC seems like a good idea.  There
> have been issues with division with them before, and it was raised back
> then that maybe we should deprecate their support.
> The previous discussion about deprecating them can be found at
> https://gcc.gnu.org/legacy-ml/gcc/2001-11/msg00790.html and
> https://gcc.gnu.org/bugzilla/show_bug.cgi?id=2995 .
> They were never standardized, and I doubt many folks use them.
> This also came up because clang/LLVM has issues similar to the ones GCC
> previously had: https://github.com/llvm/llvm-project/issues/104738 .
> I doubt many folks even know they are a supported GNU extension, so
> deprecating them for GCC 15 seems like a decent idea.
>
> Any thoughts on this? And possibly removing support in GCC 16?

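For anyone who has never run into the extension in question, a minimal
sketch (GNU C only, not standard C; the division at the end is the
operation the old reports linked above are about):

  #include <stdio.h>

  int main (void)
  {
    _Complex int a = 10 + 10i;   /* _Complex applied to an integer type,
                                    plus the "i" literal suffix */
    _Complex int b = 3 + 4i;
    _Complex int q = a / b;      /* complex integer division */
    printf ("%d%+di\n", (int) __real__ q, (int) __imag__ q);
    return 0;
  }
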
Just as an anecdote, they caused trouble for Emacs developers not so
long ago:

https://lists.gnu.org/archive/html/emacs-devel/2024-07/msg00004.html

Since changing a 0 (or 1) to "i" is such a common edit in C code, that
typo (ending up with "0i" or "1i" instead) is not going to be rare, so
adding at least a warning option analogous to clang's (undocumented?)
-Wgnu-imaginary-constant
(https://lists.gnu.org/archive/html/emacs-devel/2024-07/msg00030.html)
would be a good idea.
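
To make the failure mode concrete, here is the general shape of the slip
(not the exact code from the Emacs thread):

  #include <stdio.h>

  int main (void)
  {
    int i = 41;
    int n = i + 1i;   /* intended "i + 1"; the stray suffix makes 1i an
                         imaginary constant, the code still compiles, and
                         the imaginary part is dropped, so n == 41 */
    printf ("%d\n", n);
    return 0;
  }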

Gaussian integers are useful sometimes, but I'm not so sure about
Gaussian ints, truncated to 32 or 64 bits.  I suspect most users would
be much happier with unlimited-precision libraries that avoid the
truncation.
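
(To illustrate the truncation, assuming a target where unsigned int is
32 bits, and using unsigned so the wraparound is at least well defined:

  #include <stdio.h>

  int main (void)
  {
    _Complex unsigned z = 65536 + 65536i;   /* 2^16 + 2^16 i */
    _Complex unsigned w = z * z;            /* exact result is 0 + 2^33 i */
    printf ("%u %u\n", (unsigned) __real__ w, (unsigned) __imag__ w);
    return 0;
  }

prints "0 0", because the 2^33 in the imaginary part wraps to 0.)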

Pip Cet
