On Wed, May 03, 2006 at 02:52:57PM -0400, Sven Willenberger wrote:
> Using identical data and identical queries, why would the amd64 system
> using postgresql 8.1.3 be using some two-thirds more memory to store the
> query results before output than the i386 system using postgresql 8.0.4?
Is the amd64 system running a 64-bit build? A 64-bit binary uses 8-byte
pointers, so the client needs considerably more memory to hold the same
result set.
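One quick way to check, assuming the client is a Perl/DBI program as in the
rest of this thread: see whether the perl itself is a 64-bit build, since
8-byte pointers inflate the per-row bookkeeping in the client-side result
cache:

    $ perl -MConfig -e 'print $Config{ptrsize}, "\n"'
    8        # 8 on a 64-bit perl; a 32-bit build prints 4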
Sven Willenberger wrote:
> OK, that I do see; I guess I never noticed it on the other (i386)
> machine as the memory never exceeded the max amount allowed by the
> tunables. That raises a question though:
>
> Using identical data and identical queries, why would the amd64 system
> using postgresql 8.1.3 be using some two-thirds more memory to store the
> query results before output than the i386 system using postgresql 8.0.4?
On Wed, 2006-05-03 at 13:16 -0400, Douglas McNaught wrote:
> Sven Willenberger <[EMAIL PROTECTED]> writes:
>
> > On Sat, 2006-04-22 at 15:08 -0400, Tom Lane wrote:
> >> Francisco Reyes <[EMAIL PROTECTED]> writes:
> >> > What resource do I need to increase to avoid the error above?
> >>
> >> Process memory allowed to the client; this is not a server-side error.
Sven Willenberger <[EMAIL PROTECTED]> writes:
> On Sat, 2006-04-22 at 15:08 -0400, Tom Lane wrote:
>> Francisco Reyes <[EMAIL PROTECTED]> writes:
>> > What resource do I need to increase to avoid the error above?
>>
>> Process memory allowed to the client; this is not a server-side error.
On Sat, 2006-04-22 at 15:08 -0400, Tom Lane wrote:
> Francisco Reyes <[EMAIL PROTECTED]> writes:
> > What resource do I need to increase to avoid the error above?
>
> Process memory allowed to the client; this is not a server-side error.
>
I am experiencing an "out of memory" situation as well, on an amd64 system
running postgresql 8.1.3.
Francisco Reyes <[EMAIL PROTECTED]> writes:
> What resource do I need to increase to avoid the error above?
Process memory allowed to the client; this is not a server-side error.
regards, tom lane
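In practice, "process memory allowed to the client" means the per-process
resource limits of the client's Unix account. A sketch of inspecting and
raising them from the shell (limit names vary by OS and shell; sh/bash
shown):

    $ ulimit -a              # list the current per-process limits
    $ ulimit -d unlimited    # raise the data segment size, up to the hard limit

On FreeBSD's csh the equivalent is "limit datasize unlimited".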
What resource do I need to increase to avoid the error above?
I am trying to do a straight select against a table with 6 million records.
So far I have tried increasing SHMMAX to 512MB.
Thanks, everyone. I got it to work! Here is my solution, hoping it is
useful to the next programmer.
PROBLEM: Perl DBI for Postgres does not implement cursors. All query
results are cached in memory. For very large result sets, this gives the
"out of memory for query result" message.
The prepare/execute interface cannot stream the rows, so the fix is to
DECLARE a cursor yourself and FETCH from it in batches, as sketched below.
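A minimal sketch of that cursor-based solution, assuming a hypothetical
table big_table in database mydb (both names are illustrative, not from the
original post):

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', '', '',
                           { AutoCommit => 0, RaiseError => 1 });

    # DECLARE CURSOR must run inside a transaction (hence AutoCommit => 0).
    # The result set stays on the server; FETCH pulls it down in batches.
    $dbh->do('DECLARE csr CURSOR FOR SELECT * FROM big_table');

    my $sth = $dbh->prepare('FETCH 1000 FROM csr');
    while (1) {
        my $rows = $sth->execute;   # batch row count ('0E0', i.e. 0, when done)
        last if $rows == 0;
        while (my $row = $sth->fetchrow_hashref) {
            # ... process one row at a time ...
        }
    }

    $dbh->do('CLOSE csr');
    $dbh->commit;
    $dbh->disconnect;

Only one FETCH batch is ever held in client memory, so the process stays
small no matter how many rows the SELECT produces.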
On Sat, Oct 22, 2005 at 06:15:59PM -0400, Allen Fair wrote:
> From my googling, it seems the Perl DBD driver for Postgres does *not*
> support the cursor (see below). I hope someone can refute this!
>
> I am otherwise looking for code to implement Postgres cursors in Perl. I
> cannot find "DECLARE CURSOR" defined in the Perl DBI documentation either.
Allen Fair <[EMAIL PROTECTED]> writes:
> From my googling, it seems the Perl DBD driver for Postgres does
> *not* support the cursor (see below). I hope someone can refute this!
>
> I am otherwise looking for code to implement Postgres cursors in
> Perl. I cannot find "DECLARE CURSOR" defined in the Perl DBI
> documentation either.
From my googling, it seems the Perl DBD driver for Postgres does *not*
support the cursor (see below). I hope someone can refute this!
I am otherwise looking for code to implement Postgres cursors in Perl. I
cannot find "DECLARE CURSOR" defined in the Perl DBI documentation either.
Thanks.
On Sat, Oct 22, 2005 at 03:46:18PM -0400, Allen wrote:
> I am trying to select a result set from a 2-table join, which should be
> returning 5,045,358 rows. I receive this error:
>
> DBD::Pg::st execute failed: out of memory for query result
AFAIK, DBD::Pg never uses a cursor unless you ask it to.
I am trying to select a result set from a 2-table join, which should be
returning 5,045,358 rows. I receive this error:
DBD::Pg::st execute failed: out of memory for query result
I am using Perl with a DBI cursor (so I think) to retrieve the data
(prepare, execute, fetchrow_hashref, ..., finish).
On Fri, Apr 29, 2005 at 10:58:08AM -0600, Michael Fuhr wrote:
> Have you considered using a cursor to fetch the query results? That
> should prevent the API from trying to load the entire result set
> into memory.
I can do that. I didn't know the API would try to load the entire
result set into memory.
On Fri, Apr 29, 2005 at 10:47:36AM -0500, [EMAIL PROTECTED] wrote:
>
> DBD::Pg::st execute failed: out of memory for query result
Have you considered using a cursor to fetch the query results? That
should prevent the API from trying to load the entire result set
into memory.
--
Michael Fuhr
I have a Postgis table with about 2 million polyline records. The largest
number of points I have in a geometry field is about 500. I have a simple
DBD::Pg Perl program that does a select for most of these records and does
some processing with them before writing them to a file.
Unfortunately, I see this error:
DBD::Pg::st execute failed: out of memory for query result