On Tue, 19 Jan 2010 11:26:43 -0500, Gerald Britton wrote:

> Interestingly, I scaled it up to a million list items with more or less
> the same results.
A million items is not a lot of data. Depending on the size of each
object, that might be as little as 4 MB of data:

>>> import sys
>>> L = ['' for _ in xrange(10**6)]
>>> sys.getsizeof(L)
4348732

Try generating a billion items, or even a hundred million, and see how
you go.

This is a good lesson in the dangers of premature optimization. I can't
think how many times I've written code using a generator expression
passed to join, thinking that would surely be faster than using a list
comprehension ("save building a temporary list first").
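A rough timing sketch of that comparison might look something like the
following (an illustrative example only, assuming Python 2 as in the
session above; the data set size and repeat counts are arbitrary):

# Compare ''.join() over a list comprehension vs. a generator
# expression. The input strings are built once in the setup so
# that only the join itself is timed.
from timeit import Timer

setup = "data = [str(n) for n in xrange(10**6)]"
listcomp = Timer("''.join([s for s in data])", setup)
genexpr = Timer("''.join(s for s in data)", setup)

print "list comp: %.3f s" % min(listcomp.repeat(repeat=3, number=10))
print "gen expr:  %.3f s" % min(genexpr.repeat(repeat=3, number=10))

In CPython the list comprehension version is often the faster of the
two, since join builds a list out of a non-list iterable anyway before
it can compute the result, so the generator saves nothing.

-- 
Steven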