Not sure if that's an option for you, but upgrading to 3.0B1 is worth
trying. It fixes lots of rough edges around joint prefetching,
paginated queries, etc.
Andrus
On Nov 10, 2009, at 5:38 PM, Hans Pikkemaat wrote:
Hi,
It seems that when I set a page size on a regular query with join
prefetching, Cayenne checks whether the number of records returned
equals the page size. Because the generated query contains the
DISTINCT keyword, the duplicate entries are removed.
This then causes the exception.
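For reference, a minimal sketch of the combination being described here, assuming a hypothetical Table1Object entity with a toTable2 relationship (both names are made up for illustration):

import java.util.List;
import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.query.SelectQuery;

DataContext context = DataContext.createDataContext();
SelectQuery query = new SelectQuery(Table1Object.class);
query.addPrefetch("toTable2"); // the join prefetch described above
query.setPageSize(100);        // pagination checks the fetched row count per page
List results = context.performQuery(query);
// As described above, the generated SELECT carries DISTINCT, so duplicate rows
// are collapsed and a page can come back short of the page size, which is
// where the exception appears.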
Hi,
Thanks for your quick response!
I kind of expected that prefetches and iterated queries would not work;
that's the reason for this post :)
I tried the page size on a non-iterated query, but then I get a weird
exception that I don't understand:
Exception in thread "main" org.apache.cayenn
Maybe I'm missing something, but I doubt prefetches are implemented for
iterated queries. I would try keeping only setPageSize(1000). It will do all
prefetches and result in the same memory payload, because all data rows in your
example are turned into data objects, so they will all be cached in memory.
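A minimal sketch of that suggestion, assuming a plain SelectQuery against a hypothetical Table1Object entity with a toTable2 relationship (placeholder names):

import java.util.List;
import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.query.SelectQuery;

DataContext context = DataContext.createDataContext();
SelectQuery query = new SelectQuery(Table1Object.class);
query.addPrefetch("toTable2"); // placeholder relationship name
query.setPageSize(1000);       // resolve the result one page at a time
List results = context.performQuery(query);
for (Object o : results) {
    Table1Object t1 = (Table1Object) o;
    // work with t1 and its prefetched toTable2 objects here;
    // pages of 1000 objects are faulted in as the list is traversed
}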
Hi,
I'll give you some insight into what I tried.
DataContext dataContext = this.createDataContext();
String sql = "select ... from table1 join table2 ...";
SQLTemplate query = new SQLTemplate(Table1Object.class, sql);
query.addPrefetch(table2.RELATION_PROPERTY
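The snippet above is cut off; purely as a guess at how such an attempt typically continues (and with the caveat from the reply above that prefetches may not be supported for iterated queries at all), an iterated read over data rows with the 2.0/3.0-era access API looks roughly like this, written from memory, so the exact method signatures are worth double-checking:

import java.util.Map;
import org.apache.cayenne.CayenneException;
import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.access.ResultIterator;
import org.apache.cayenne.query.Query;

void iterate(DataContext dataContext, Query query) throws CayenneException {
    // streams data rows one at a time instead of building the full result list
    ResultIterator it = dataContext.performIteratedQuery(query);
    try {
        while (it.hasNextRow()) {
            Map row = it.nextDataRow();
            // handle a single row here
        }
    }
    finally {
        it.close();
    }
}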
Hi,
I think the best in your case would be using joint prefetching:
http://cayenne.apache.org/doc20/prefetching.html
This way data from both tables will be returned together (in the same db row), so
you will be able to iterate through them at the same time.
I also highly encourage you to try today-rele
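In code, that kind of prefetch is typically attached to a SelectQuery roughly as below, assuming the 1.2+ API where addPrefetch() returns the prefetch node (entity and relationship names are placeholders):

import org.apache.cayenne.query.PrefetchTreeNode;
import org.apache.cayenne.query.SelectQuery;

SelectQuery query = new SelectQuery(Table1Object.class);
// ask for the related table2 data in the same result rows as the root objects,
// so both tables can be walked together in a single pass
query.addPrefetch("toTable2")
     .setSemantics(PrefetchTreeNode.JOINT_PREFETCH_SEMANTICS);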
Hi,
My case:
I have a complex query which returns a huge amount of data. It returns
data from two tables which are joined.
Because the amount of data is huge, I cannot load all of it into memory.
For this reason I want to use an iterated query.
I also want to prevent Cayenne from executing a