I'm exporting a database table through a mod_perl2 handler.  The
problem is that for large tables, the httpd process balloons to
consume a lot of RAM.  For example, a 299 MB MySQL table (size of the
.MYD file), which produces a 35 MB export, causes httpd to consume
about 220 MB of RAM!

My code is fairly straightforward, so I must be missing something
about buffering or how the rows are fetched:

while ( my $rowref = $sth->fetchrow_arrayref ) {
  $r->print( $rowref->[0] );
  # ...more $r->print calls, one for each field...
}

Thinking perhaps mod_perl is buffering the entire output, I tried
calling "$r->rflush" after printing each row, and I also tried setting
"local $| = 1", but neither helped.
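One thing I'm starting to suspect (untested; $dbh and $sql below stand in for my real handle and query) is that the buffering is happening in DBD::mysql rather than in mod_perl: by default it uses mysql_store_result, which pulls the entire result set into client memory before the first fetchrow_arrayref returns.  Something like this might stream rows from the server one at a time instead:

```perl
# Assumption: the memory is going to DBD::mysql's client-side result
# buffer (mysql_store_result, the default), not to mod_perl's output
# buffer.  mysql_use_result fetches rows from the server on demand.
my $sth = $dbh->prepare( $sql, { mysql_use_result => 1 } );
$sth->execute;
while ( my $rowref = $sth->fetchrow_arrayref ) {
    # Rows now arrive one at a time instead of all at once.
    $r->print( join( "\t", @$rowref ), "\n" );
}
$sth->finish;
```

(I realize mysql_use_result holds the table read-locked until the fetch loop finishes, so maybe there's a better approach.)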

Any ideas on how to keep memory utilization under control?

thanks
JB
