Eric V. Smith <e...@trueblade.com> added the comment:

I don't think it really matters what other languages do. We're not designing 
this from scratch. We need to reflect the state we're in, which is many, many 
lines of working code using 'j'. I see the two options as: support both 'i' and 
'j', or break existing code and switch to 'i'. I don't think either option 
makes sense.
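For context, a minimal sketch of the current behavior being discussed: Python's complex literals and `complex()` string parsing accept only the 'j' suffix, so an 'i'-based spelling fails today.

```python
# Complex literals in Python use the 'j' suffix for the imaginary part.
z = 3 + 4j
print(z.real, z.imag)  # -> 3.0 4.0

# String parsing via complex() likewise expects 'j'.
print(complex("3+4j"))  # -> (3+4j)

# An 'i' suffix is not recognized and raises ValueError.
try:
    complex("3+4i")
except ValueError as exc:
    print("rejected:", exc)
```

This illustrates why switching to 'i' alone would break existing code, while accepting both would add a second spelling for the same value.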

----------
title: Use normal 'i -> Use normal 'i' character to denote imaginary part of complex numbers

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue43025>
_______________________________________