On Sun, 22 Dec 2024 at 19:17, Gilmeh Serda via Python-list
<python-list@python.org> wrote:
>
> Was just playing with numbers and stumbled on something I've never seen
> before.
...
>
> >>> 9**9**4
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
> ValueError: Exceeds the limit (4300 digits) for integer string conversion;
> use sys.set_int_max_str_digits() to increase the limit
>
> Explanation:
> https://discuss.python.org/t/int-str-conversions-broken-in-latest-python-bugfix-releases/18889
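For reference, the limit in the traceback above can be raised or removed at
runtime. A minimal sketch (it assumes CPython 3.11+, or a bugfix release
that backported the limit, where sys.set_int_max_str_digits() exists and
passing 0 disables the limit):

```python
import sys

# 9**9**4 == 9**6561, an integer with a few thousand decimal digits --
# more than the default 4300-digit limit for int <-> str conversion.
n = 9 ** 9 ** 4

try:
    str(n)
    print("no conversion limit active on this interpreter")
except ValueError as exc:
    print("conversion blocked:", exc)

# 0 disables the limit entirely; otherwise the value must be >= 640.
# Guard with hasattr so this also runs on interpreters without the limit.
if hasattr(sys, "set_int_max_str_digits"):
    sys.set_int_max_str_digits(0)

print("digit count:", len(str(n)))
```

The arithmetic itself (9 ** 9 ** 4) is never limited; only the conversion
between int and str is checked against the limit.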
I think the original security concern was mainly motivated by the
string-to-int direction, i.e. calling int(s) on a possibly large string s
(perhaps from an untrusted source) might be slow with CPython, whose
str <-> int conversion takes time quadratic in the number of digits. To
address that, conversions in both directions (str -> int and int -> str)
were limited to 4300 digits by default.

Now that more time has passed, it has become clearer that the int -> str
direction is the one people are more likely to bump into as a result of
this limit (as you just did). I find it harder to see what the security
problem is in that direction, but I don't think this will be changed.
CPython has an implementation of arbitrarily large integers, but an
important part of it is hobbled. If you do want to work with such large
integers then I recommend either gmpy2's gmpy2.mpz type or python-flint's
flint.fmpz type.

At the same time, it is not hard to run into slowness with large integers
in other ways, e.g. 10**10**10, but that won't come up in string parsing
unless you are using eval.

Not a likely security issue, but I am suddenly reminded of this dangerous
snippet:

    x = [0]; x.extend(iter(x))

If you want to test it then make sure to save your work etc. and be
prepared to hard-reset the computer. On this machine Ctrl-C doesn't work
for this, but Ctrl-\ does if you do it quickly enough.

--
Oscar
--
https://mail.python.org/mailman/listinfo/python-list