Hi, I am running a database behind a web server, and there is a table
which is huge. I need to pull data from this table and send it to the
user over HTTP. If I use

select * from huge_table where userid = 100

it will return millions of records, which exhausts my server's memory.
So I do this:

select * from huge_table where userid = 100 limit 1000 offset 0
and then send the results to the user, then

select * from huge_table where userid = 100 limit 1000 offset 1000
and then send the results to the user, then

select * from huge_table where userid = 100 limit 1000 offset 2000
and then send the results to the user.

I continue this until no records are left.
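
In code, the loop looks roughly like this. This is only a minimal
sketch assuming psycopg2; the connection string and send_to_user are
placeholders for my actual HTTP code:

import psycopg2

PAGE_SIZE = 1000

def send_to_user(rows):
    # placeholder for the code that writes rows into the HTTP response
    pass

conn = psycopg2.connect("dbname=mydb")  # placeholder connection string
cur = conn.cursor()

offset = 0
while True:
    cur.execute(
        "select * from huge_table where userid = %s limit %s offset %s",
        (100, PAGE_SIZE, offset),
    )
    rows = cur.fetchall()
    if not rows:  # no records left, stop
        break
    send_to_user(rows)
    offset += PAGE_SIZE

cur.close()
conn.close()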

It works, but it is kind of slow. I think this is because even though I
only need 1000 records per page, each query still has to scan past all
the rows before the offset, so later pages take longer and longer.
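
To show what I mean by slow, here is roughly how I timed a few pages at
increasing offsets (again just a sketch, assuming psycopg2 and a
placeholder connection string); the elapsed time grows as the offset
grows:

import time
import psycopg2

conn = psycopg2.connect("dbname=mydb")  # placeholder connection string
cur = conn.cursor()

for offset in (0, 100000, 1000000):
    start = time.monotonic()
    cur.execute(
        "select * from huge_table where userid = %s limit 1000 offset %s",
        (100, offset),
    )
    cur.fetchall()
    print(offset, time.monotonic() - start)  # seconds for this page

cur.close()
conn.close()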

Is there a better way to do this?

Thank you.

ff



