[...] to interesting times :)
..Theo
-----Original Message-----
From: Fred Moyer [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, January 18, 2005 2:11 PM
To: Chris Ochs
Cc: Peter Haworth; modperl@perl.apache.org
Subject: Re: DBI memory usage
On Tue, 18 Jan 2005 10:12:06 -0800, Chris Ochs wrote:
> On Tue, 18 Jan 2005 10:43:02 +0000, Peter Haworth
> <[EMAIL PROTECTED]> wrote:
> > By using a cursor, you can specify exactly how much data you want
> > at a time:
>
> DBD::Pg doesn't support cursors. From the DBD::Pg manpage:
>
> "Although [...]
On Tue, 18 Jan 2005 10:43:02 +0000, Peter Haworth
<[EMAIL PROTECTED]> wrote:
> On Mon, 17 Jan 2005 17:46:17 -0800, Chris Ochs wrote:
> > It looks like $r->child_terminate does what I need. In the case of
> > Postgresql it eats the memory when you execute the query, regardless
> > of whether you actually fetch any results.
On Mon, 17 Jan 2005 17:46:17 -0800, Chris Ochs wrote:
> It looks like $r->child_terminate does what I need. In the case of
> Postgresql it eats the memory when you execute the query, regardless
> of whether you actually fetch any results.
That can be worked around, however. By using a cursor, you can specify
exactly how much data you want at a time.
>
> Also ask yourself if you really need all of the data at
> once. You may be able to filter it down in the query or
> build some incremental data structures using row-by-row
> iteration instead of fetchall_arrayref.
Ya I do, it's basically a customer list export from the database that
I write o [...]
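For an export like this, the row-by-row iteration Perrin suggests would look roughly as follows (the query, column names, and output path are hypothetical). One caveat from upthread: with DBD::Pg of that era, libpq buffers the entire result client-side at execute time, so this pattern only bounds memory with drivers that actually stream rows:

```perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                       { RaiseError => 1 });

my $sth = $dbh->prepare('SELECT id, name, email FROM customers');
$sth->execute;

open my $fh, '>', '/tmp/customers.csv' or die "open: $!";

# fetchrow_arrayref hands back one row at a time (reusing a small
# buffer), unlike fetchall_arrayref, which materializes the whole
# result set in the apache process at once.
while (my $row = $sth->fetchrow_arrayref) {
    print {$fh} join(',', @$row), "\n";
}

close $fh or die "close: $!";
$dbh->disconnect;
```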
> > Yes. Many databases support sending the results a few rows at a time
> > instead of all at once. For specific advice on this, you might check
> > your DBD documentation or ask on the dbi-users list. You can also have
> > your program schedule the current apache process to exit after finishing.
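Scheduling the child to exit is what $r->child_terminate does under mod_perl 1.x: the child finishes serving the current request, then exits, returning the memory the big query inflated. A sketch of such a handler; the package name and query are made up:

```perl
package My::BigExport;
use strict;
use warnings;
use Apache::Constants qw(OK);

sub handler {
    my $r = shift;

    # ... run the large query and stream the results to the client ...

    # Tell Apache this child should exit after finishing the current
    # request, so its inflated memory footprint is released rather
    # than carried into the next request.
    $r->child_terminate;

    return OK;
}

1;
```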
Perrin Harkins wrote:
Chris Ochs wrote:
Is there a way to do large queries that return lots of data without
having my apache process grow by the equivalent size in ram of the
data returned?
Yes. Many databases support sending the results a few rows at a time
instead of all at once. For specific advice on this, you might check
your DBD documentation or ask on the dbi-users list.
Is there a way to do large queries that return lots of data without
having my apache process grow by the equivalent size in ram of the
data returned? The only thing I can think of is to run a separate
script outside of mod perl for queries like this.
Chris