On Thursday, February 12, 2015 at 3:08:10 AM UTC-8, Fabien wrote:
> ... what a coincidence then that a huge majority of scientists
> (including me) dont care AT ALL about unicode. But since scientists are
> not paid to rewrite old code, the scientific world is still stuck to
> python 2.
I'm a scientist. I'm a happy Python 3 user who migrated from Python 2 about two years ago. And I use Unicode in my Python.

In implementing some mathematical models that have variables like delta, gamma, and theta, I decided that I didn't like the line lengths I was getting with such variable names. I'm using δ, γ, and θ instead. It works fine, at least on my Ubuntu Linux system (and what scientist doesn't use Linux?). I also have special mathematical symbols, superscripted numbers, etc. in my program comments. It's easier to read 2x³ + 3x² than 2*x**3 + 3*x**2.

I am teaching someone Python who is having a few problems with Unicode on his Windows 7 machine. It would appear that Windows shipped with a less-than-complete Unicode font for its command shell. But that's not Python's fault.
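To make the Greek-letter identifiers concrete, here is a minimal sketch of the kind of thing I mean. The model and the numbers are made up purely to show the syntax; the point is just that Python 3 treats non-ASCII letters as ordinary identifier characters (PEP 3131):

    from math import exp

    # Toy model, only to demonstrate that Greek letters are ordinary
    # identifiers in Python 3: δ is a decay rate, γ a gain, θ a scale.
    def response(t, δ=0.1, γ=2.5, θ=1.0):
        # Unicode in comments works too: value ∝ θ·γ·e^(−δt)
        return θ * γ * exp(-δ * t)

    print(response(3.0))   # ≈ 1.852

Under Python 2 the same file would raise a SyntaxError unless every Greek letter were spelled out, which is exactly the line-length problem I was trying to avoid.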