On Jan 17, 2005, at 11:44 PM, Chris Ochs wrote:
Also ask yourself if you really need all of the data at once. You may be able to filter it down in the query or build some incremental data structures using row-by-row iteration instead of fetchall_arrayref.
Yeah, I do. It's basically a customer list export from the database that I write out to a temp file, archive with zip, and then print to the browser. However, I'm just iterating with fetchrow, not using fetchall. DBD::Pg pulls everything into memory when you run the query, before you fetch anything.
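[Editor's note: since DBD::Pg has libpq buffer the entire result set client-side at execute time, one documented workaround is to declare a server-side cursor and FETCH it in batches, so only one batch is in memory at a time. Below is a minimal sketch of that idea, not Chris's actual code; the table name (customers), output path, and batch size are assumptions.]

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Connection details are placeholders -- adjust for your setup.
    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                           { RaiseError => 1, AutoCommit => 0 });

    # A server-side cursor keeps the result set on the PostgreSQL side;
    # DBD::Pg then only buffers one FETCH batch at a time.
    $dbh->do('DECLARE cust_csr CURSOR FOR SELECT * FROM customers');

    open my $out, '>', '/tmp/customer_export.csv' or die "open: $!";
    while (1) {
        my $sth = $dbh->prepare('FETCH 1000 FROM cust_csr');
        $sth->execute;
        last if $sth->rows == 0;    # no rows left in the cursor
        while (my $row = $sth->fetchrow_arrayref) {
            print {$out} join(',', map { defined $_ ? $_ : '' } @$row), "\n";
        }
    }
    close $out;

    $dbh->do('CLOSE cust_csr');
    $dbh->commit;
    $dbh->disconnect;

The temp file could then be zipped and streamed to the browser as Chris describes; that part is left out to keep the sketch short.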
Chris,
In this particular case, could you grab just the unique IDs for the rows of interest in your first query and then iterate over them (similar to what Class::DBI does)? Depending on the row size, this could save a good deal of memory. It will obviously be a performance hit, though....
Sean
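[Editor's note: a rough sketch of Sean's suggestion follows. The customers table and its id primary key are assumptions, not names from the thread. The first query buffers only the IDs; each full row is then fetched individually, which trades one query for N small ones but keeps memory use flat.]

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                           { RaiseError => 1 });

    # First query: only the primary keys, so the buffered result set
    # stays small even for a large table.
    my $ids = $dbh->selectcol_arrayref('SELECT id FROM customers');

    # Then fetch one full row at a time by ID, Class::DBI-style.
    my $row_sth = $dbh->prepare('SELECT * FROM customers WHERE id = ?');

    open my $out, '>', '/tmp/customer_export.csv' or die "open: $!";
    for my $id (@$ids) {
        $row_sth->execute($id);
        my $row = $row_sth->fetchrow_arrayref or next;
        print {$out} join(',', map { defined $_ ? $_ : '' } @$row), "\n";
    }
    close $out;
    $dbh->disconnect;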