On 2012-10-23, Steven D'Aprano <steve+comp.lang.pyt...@pearwood.info> wrote:

> I would be very surprised if the poster will be able to fit 100
> gigabytes of data into even a single list comprehension, let alone
> two.
>
> This is a classic example of why the old external processing
> algorithms of the 1960s and 70s will never be obsolete. No matter how
> much memory you have, there will always be times when you want to
> process more data than you can fit into memory.

Too true.  One of the projects I did in grad school about 20 years ago
was a plugin for some fancy data visualization software (I think it
was DX: http://www.research.ibm.com/dx/). My plugin would subsample, on
the fly, a selected section of a huge 2D array of data in a file.
IBM and SGI had all sorts of widgets you could use to sample,
transform and visualize data, but they all assumed that the input data
would fit into virtual memory.
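
For the record, here's roughly how you could do that sort of thing in
Python today with numpy.memmap -- just a sketch, assuming the file holds
a raw, row-major float32 array of known shape (the function name and
'huge.dat' are made up for illustration):

    import numpy as np

    def subsample_region(path, shape, rows, cols, step=10,
                         dtype=np.float32):
        """Return a decimated copy of a rectangular region of a big
        2D array stored as raw, row-major binary data in `path`.

        Only the pages containing sampled elements are ever read;
        the rest of the file stays on disk.
        """
        r0, r1 = rows
        c0, c1 = cols
        # memmap just maps the file; slicing computes offsets lazily,
        # and the final np.array() copy reads only the selected data.
        data = np.memmap(path, dtype=dtype, mode='r', shape=shape)
        return np.array(data[r0:r1:step, c0:c1:step])

    # Every 50th point from the upper-left quadrant of a
    # 100000 x 100000 array:
    # chunk = subsample_region('huge.dat', (100000, 100000),
    #                          (0, 50000), (0, 50000), step=50)

Back then it was all explicit seek()/read() bookkeeping, but the idea
is the same: never assume the whole array fits in (virtual) memory.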

-- 
Grant Edwards               grant.b.edwards        Yow! I Know A Joke!!
                                  at               
                              gmail.com            