On Wed, Jun 29, 2016 at 2:51 PM, Lawrence D’Oliveiro <lawrenced...@gmail.com> wrote:
> On Wednesday, June 29, 2016 at 4:20:24 PM UTC+12, Chris Angelico wrote:
>>> https://www.jwz.org/blog/2010/10/every-day-i-learn-something-new-and-stupid/
>>
>> """It would also be reasonable to assume that any sane language
>> runtime would have integers transparently degrade to BIGNUMs, making
>> the choice of accuracy over speed, but of course that almost never
>> happens..."""
>>
>> Python 2 did this, but Python 3 doesn't.
>
> Huh?
>
> ldo@theon:~> python3
> Python 3.5.1+ (default, Jun 10 2016, 09:03:40)
> [GCC 5.4.0 20160603] on linux
> Type "help", "copyright", "credits" or "license" for more information.
> >>> 2 ** 64
> 18446744073709551616
> >>> type(2 ** 64)
> <class 'int'>
The transparent shift from machine words to bignums is what no longer exists. Both Py2 and Py3 store large integers as bignums; the difference is that Py2 has two separate types (int and long), with int operations generally outperforming long ones, while Py3 has just one type (called int, but functionally Py2's long) and does everything with bignums. There's no longer a boundary; instead, every integer operation pays the "bignum tax". How steep is that tax? I'm not sure, but microbenchmarking shows that there definitely is one. How bad it is in real-world code, I have no idea.
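For the curious, here's a minimal sketch of the kind of microbenchmark I mean (the operand values and loop count are arbitrary, and absolute numbers will vary by machine and interpreter version):

#!/usr/bin/env python
# Rough "bignum tax" microbenchmark sketch.
import timeit

def bench(label, setup):
    # Total seconds for 10**6 iterations == microseconds per operation.
    best = min(timeit.repeat('x + y', setup, number=10**6, repeat=5))
    print('{0}: {1:.3f} us/op'.format(label, best))

# Small ints: fit in one machine word, but Py3 still routes them
# through the arbitrary-precision implementation.
bench('small ints', 'x, y = 12345, 67890')

# Big ints: stored as multiple internal digits, so the bignum
# machinery has real work to do either way.
bench('big ints  ', 'x, y = 2**200, 3**150')

The script happens to run under both python2 and python3, so comparing the small-int line across the two interpreters gives a feel for what the old int/long boundary used to buy.

ChrisA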