Thanks,

From the beginning I've been convinced that ResultIterators are the right
approach; unfortunately they are useless if the JDBC driver loads the entire
result set into memory before giving access to it.

I hope something can be done about this, because it's a showstopper for me.
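For reference, here is a minimal sketch of how I'm iterating (assuming the pre-3.1 Cayenne API with performIteratedQuery(); the "Article" entity name is hypothetical):

```java
import org.apache.cayenne.DataRow;
import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.access.ResultIterator;
import org.apache.cayenne.query.SelectQuery;

public class BigSelect {
    public static void main(String[] args) throws Exception {
        // Hypothetical mapped entity "Article"; any large table will do.
        DataContext context = DataContext.createDataContext();
        SelectQuery query = new SelectQuery("Article");

        ResultIterator it = context.performIteratedQuery(query);
        try {
            while (it.hasNextRow()) {
                DataRow row = (DataRow) it.nextDataRow();
                // Process one row at a time. Only the current row should
                // need to be in memory -- unless the JDBC driver has
                // already buffered the whole ResultSet behind the scenes.
            }
        } finally {
            it.close(); // always release the underlying connection
        }
    }
}
```

This is exactly the pattern from the "iterating through data rows" docs linked below, so the problem really seems to sit in the driver's fetch behavior, not in the Cayenne code.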


Robert Zeigler-6 wrote:
> 
> Take a look at paginated queries (which isn't /quite/ what you want,  
> but may help you):
> 
> http://cayenne.apache.org/doc/paginated-queries.html
> 
> As well as ResultIterator:
> http://cayenne.apache.org/doc/iterating-through-data-rows.html
> 
> Robert
> 
> On May 15, 2009, at 9:21 AM, stefcl wrote:
> 
>>
>>
>> Thanks, but my problem is not related to paging.
>>
>> I would like to execute a select query that returns approximately
>> 100,000 rows with a single SelectQuery and process them one by one,
>> keeping only a few of them in memory (using the ResultIterator).
>>
>> In JDBC, Statement.setFetchSize(1000) tells the JDBC driver to
>> retrieve results from the database 1000 rows at a time, fetching the
>> next 1000 as you iterate the ResultSet. Otherwise its default
>> behavior is to load all 100,000 rows into memory before you can
>> start iterating.
>>
>> It's not the same thing as inserting a TOP or a LIMIT clause in the
>> query.
>>
>> Any help appreciated
>>
>>
>> Andrey Razumovsky wrote:
>>>
>>> Hi,
>>>
>>> Yes of course. You can use SelectQuery.setFetchLimit(int) and
>>> setFetchStart(int) methods. They do just that.
>>>
>>> Andrey
>>>
>>
>> -- 
>> View this message in context:
>> http://www.nabble.com/Big-selects-on-PostGres-%3A-Configuring-Statement.setFetchSize%28%29-of-a-selectquery-tp23555842p23560831.html
>> Sent from the Cayenne - User mailing list archive at Nabble.com.
>>
> 
> 
> 
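For what it's worth, the plain-JDBC streaming setup described above looks roughly like this against PostgreSQL. As far as I know, the PostgreSQL driver only honors setFetchSize() when autoCommit is off and the ResultSet is forward-only; the connection URL, credentials, and table name here are hypothetical:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class StreamingSelect {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details.
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "secret");
        try {
            // PostgreSQL streams results only when autoCommit is off
            // and the statement is TYPE_FORWARD_ONLY (the default).
            conn.setAutoCommit(false);

            PreparedStatement st = conn.prepareStatement(
                    "SELECT * FROM big_table");
            st.setFetchSize(1000); // fetch 1000 rows per round trip

            ResultSet rs = st.executeQuery();
            while (rs.next()) {
                // Only about 1000 rows are buffered at any time,
                // instead of the whole result set.
            }
            rs.close();
            st.close();
        } finally {
            conn.close();
        }
    }
}
```

What I'm looking for is a way to get Cayenne to apply an equivalent fetch size to the statements it creates internally.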

-- 
View this message in context: 
http://www.nabble.com/Big-selects-on-PostGres-%3A-Configuring-Statement.setFetchSize%28%29-of-a-selectquery-tp23555842p23561312.html
Sent from the Cayenne - User mailing list archive at Nabble.com.