On Sun, Mar 9, 2014 at 6:43 AM, matshyeq <matsh...@gmail.com> wrote:

> Hello,
>
> I've found an issue when trying to implement fetching rows from a big
> table (2 million rows) in my app.
> Basically, I don't see an elegant and easy way (other than always using
> cursors) to limit the number of rows returned.
> This causes my application to break due to excessive memory
> consumption.
>
> I'm using Perl and DBD::Pg library but contacted maintainer who actually
> pointed out this is an issue that goes much deeper (libpq):
>
> "Unfortunately, this is a limitation in the underlying driver (libpq)
> rather than DBD::Pg itself. There have been talks over the years of
> supporting this, but nothing concrete yet. Your best bet would be to ask
> about this on the Postgres lists"
>

I don't think this is correct.  First, DBD::Pg could get tricky and
automatically wrap your query in a cursor, then fetch from that cursor
behind the scenes.  I believe this is what Python's module does for you in
some modes.  Second, the feature needed to do this without even using a
cursor was added to libpq about 1.5 years ago (PQsetSingleRowMode); DBD::Pg
just hasn't been taught how to use it yet.
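
For illustration, here is roughly what that cursor wrapping looks like if
you do it by hand with DBD::Pg today -- the connection details, table name,
and batch size below are just placeholders:

    #!/usr/bin/perl
    # Hand-rolled version of the first strategy: wrap the query in a
    # cursor and fetch it back in batches so memory stays bounded.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                           { AutoCommit => 0, RaiseError => 1 });

    # Cursors only live inside a transaction, hence AutoCommit => 0.
    $dbh->do('DECLARE big_cur CURSOR FOR SELECT * FROM big_table');

    while (1) {
        my $rows = $dbh->selectall_arrayref('FETCH 1000 FROM big_cur');
        last unless @$rows;
        for my $row (@$rows) {
            # ... process one row at a time ...
        }
    }

    $dbh->do('CLOSE big_cur');
    $dbh->commit;
    $dbh->disconnect;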

The first strategy could probably be done purely in Perl; the second would
require changes to the C parts of DBD::Pg.
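
For reference, here is a minimal sketch of what single-row mode looks like
at the libpq level (the connection string and query are placeholders, and
error handling is pared down):

    /* Second strategy: ask libpq for one row per PGresult instead of
     * buffering the whole result set in client memory. */
    #include <stdio.h>
    #include <libpq-fe.h>

    int main(void)
    {
        PGconn *conn = PQconnectdb("dbname=mydb");
        if (PQstatus(conn) != CONNECTION_OK) {
            fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
            PQfinish(conn);
            return 1;
        }

        /* The query has to be sent asynchronously, and single-row mode
         * must be requested before any results are collected. */
        if (!PQsendQuery(conn, "SELECT * FROM big_table") ||
            !PQsetSingleRowMode(conn)) {
            fprintf(stderr, "send failed: %s", PQerrorMessage(conn));
            PQfinish(conn);
            return 1;
        }

        PGresult *res;
        while ((res = PQgetResult(conn)) != NULL) {
            if (PQresultStatus(res) == PGRES_SINGLE_TUPLE) {
                /* Exactly one row per result; process it and move on. */
                printf("%s\n", PQgetvalue(res, 0, 0));
            }
            PQclear(res);
        }

        PQfinish(conn);
        return 0;
    }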

Of course, just because it can be implemented in DBD::Pg doesn't mean anyone
has an obligation to do it.  You could speed that along by contributing the
code yourself.  But I would say the ball is firmly in DBD::Pg's court.


Cheers,

Jeff
