> I need to read a large amount of data into a list. So I am trying to
> see if I'll have any memory problem. When I do
>
>     x=range(2700*2700*3)
>
> I got the following message:
>
>     Traceback (most recent call last):
>       File "<stdin>", line 1, in ?
>     MemoryError
>
> Any way to get around this problem? I have a machine of 4G memory. The
> total number of data points (float) that I need to read is in the order
> of 200-300 millions.
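For a rough sense of scale (assuming CPython 2, which the list-building range() and the "in ?" traceback frame suggest): the failing expression asks for a list of 2700*2700*3 = 21,870,000 boxed integers, and 200-300 million floats stored as individual Python objects cost roughly 20-30 bytes apiece, i.e. several gigabytes before any real work starts. Iterating lazily avoids building the list at all; a minimal sketch:

    # Sketch, assuming Python 2: xrange() yields values one at a time
    # instead of materializing all ~21.9 million of them in a list.
    n = 2700 * 2700 * 3        # 21,870,000 elements

    total = 0.0
    for i in xrange(n):        # constant memory, regardless of n
        total += i             # stand-in for real per-element work
    print(total)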
While others on the list have given you options for how to accommodate this monstrosity, you've not mentioned what you intend to do with the data once you've shoveled it all into RAM. Often, the easiest way to solve the problem is to prevent it from happening in the first place.

Is there any way to operate on your data in a stream-oriented fashion? Or to use a database filestore underneath? Either would let you operate at a much smaller scale, perhaps simply gathering some aggregate statistics while skimming along the data stream.

-tkc

--
http://mail.python.org/mailman/listinfo/python-list
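As a concrete illustration of that stream-oriented idea, here is a minimal sketch that keeps only running aggregates, so memory use stays constant no matter how many points flow past. The file name "data.txt" and the whitespace-separated-floats layout are assumptions, not details from the original post:

    # Streaming sketch: read the data once, keep only running
    # aggregates, never hold the full list of floats in memory.
    # "data.txt" with whitespace-separated floats is an assumed format.
    count = 0
    total = 0.0
    lo = None
    hi = None

    f = open("data.txt")
    for line in f:                     # one line at a time
        for token in line.split():
            x = float(token)
            count += 1
            total += x
            if lo is None or x < lo:
                lo = x
            if hi is None or x > hi:
                hi = x
    f.close()

    if count:
        print("n=%d  mean=%g  min=%g  max=%g" % (count, total / count, lo, hi))

The same skimming loop is also a natural place to funnel rows into a database (sqlite3 in the standard library, for instance) if later random-access queries are needed, rather than keeping everything around as Python objects.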