Steven D'Aprano wrote:
> Twice in a couple of weeks, I have locked up my PC by running a Python 2.5
> script that tries to create a list that is insanely too big.
>
> In the first case, I (stupidly) did something like:
>
> mylist = [0]*12345678901234
>
> After leaving the machine for THREE DAYS (!!!) I eventually was able to
> get to a console and kill the Python process. Amazingly, it never raised
> MemoryError in that time.
>
> The second time was a little less stupid, but not much:
>
> mylist = []
> for x in itertools.combinations_with_replacement(some_big_list, 20):
>     mylist.append(func(x))
>
> After three hours, the desktop is still locked up. I'm waiting to see what
> happens in the morning before rebooting.
>
> Apart from "Then don't do that!", is there anything I can do to prevent
> this sort of thing in the future? Like instruct Python not to request more
> memory than my PC has?
>
> I am using Linux desktops; both incidents were with Python 2.5. Do newer
> versions of Python respond to this sort of situation more gracefully?
If you are starting these scripts from the shell, how about ulimit?

$ ulimit -v 40000
$ python -c'print range(10**5)[-1]'
99999
$ python -c'print range(10**6)[-1]'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
MemoryError
$
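Note that bash's ulimit -v is given in kilobytes, so the 40000 above caps the
interpreter at roughly 40 MB of address space.

If you would rather apply the cap from inside the script than from the shell,
the resource module exposes the same limit. Here is a minimal sketch; it
assumes Linux (it relies on RLIMIT_AS, the address-space limit), and the
1 GiB figure is only an illustrative value:

import resource

# Cap this process's total address space at ~1 GiB (example value).
# Once the cap is reached, allocations fail and Python raises
# MemoryError promptly instead of driving the machine into swap.
soft, hard = resource.getrlimit(resource.RLIMIT_AS)
resource.setrlimit(resource.RLIMIT_AS, (1024 * 1024 * 1024, hard))

try:
    mylist = [0] * 12345678901234
except MemoryError:
    print "Caught MemoryError instead of locking up the desktop"

Child processes inherit the limit, so setting it early in a wrapper script
covers anything the script goes on to spawn as well.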