On Fri, May 11, 2012 at 3:29 PM, gry <georgeryo...@gmail.com> wrote:
> sys.version --> '2.6 (r26:66714, Feb 21 2009, 02:16:04) \n[GCC 4.3.2
> [gcc-4_3-branch revision 141291]]'
>
> I thought this script would be very lean and fast, but with a large
> value for n (like 150000), it uses 26G of virtual memory, and things
> start to crumble.
>
> #!/usr/bin/env python
> '''write a file of random integers. args are: file-name how-many'''
> import sys, random
>
> f = open(sys.argv[1], 'w')
> n = int(sys.argv[2])
> for i in xrange(n):
>     print >>f, random.randint(0, sys.maxint)
> f.close()
>
> What's using so much memory?
I don't know; I'm not able to replicate the problem you're reporting.
When I try your script with a value of 150000, it runs in under a
second and does not appear to consume any more virtual memory than the
Python interpreter normally uses. I suspect there is something else at
play here.

> What would be a better way to do this? (aside from checking arg
> values and types, I know...)

I don't see anything wrong with the way you're currently doing it,
assuming you can solve your memory leak issue.

Ian
--
http://mail.python.org/mailman/listinfo/python-list
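For reference, here is a sketch of the same approach translated to Python 3, where it also runs in flat memory for me. Note that sys.maxint no longer exists in Python 3, so I've substituted 2**63 - 1 as the upper bound; that choice, and the write_random_ints name, are just illustrative:

```python
#!/usr/bin/env python3
"""Write a file of random integers, one per line. Args: file-name how-many."""
import random
import sys


def write_random_ints(path, n, upper=2**63 - 1):
    # Stream one line at a time; memory stays flat no matter how large n is,
    # since range() in Python 3 is lazy (like xrange in Python 2) and each
    # formatted line is written and discarded immediately.
    with open(path, "w") as f:
        for _ in range(n):
            f.write("%d\n" % random.randint(0, upper))


if __name__ == "__main__":
    write_random_ints(sys.argv[1], int(sys.argv[2]))
```

The only structural difference from the original is the with-statement, which guarantees the file is closed even if an exception interrupts the loop.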