Well, Wi-Fi and VPN will be slower than more direct access (local
Ethernet or localhost). I still think breaking apart the query is a
good idea so you can do it in multiple DataContexts and manage the
memory footprint better (which may not be an issue for you).
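Roughly the shape I have in mind, for what it's worth. This is just a
sketch: MyEntity stands in for your mapped class, it assumes the integer
PK is mapped as an attribute called "id" with a getId() getter, and the
batch size is arbitrary.

import java.util.List;

import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.exp.Expression;
import org.apache.cayenne.exp.ExpressionFactory;
import org.apache.cayenne.query.SelectQuery;

public class MaintenanceScan {

    private static final int BATCH_SIZE = 1000;

    public static void scanAll() {
        long lastId = 0;

        while (true) {
            // throwaway context per batch: once it falls out of scope,
            // the objects it registered can be garbage collected
            DataContext context = DataContext.createDataContext();

            Expression qualifier = ExpressionFactory.greaterExp("id", lastId);
            SelectQuery query = new SelectQuery(MyEntity.class, qualifier);
            query.addOrdering("id", true);      // ascending by PK
            query.setFetchLimit(BATCH_SIZE);

            List batch = context.performQuery(query);
            if (batch.isEmpty()) {
                return;
            }

            for (Object o : batch) {
                MyEntity entry = (MyEntity) o;
                // ... per-row maintenance work goes here ...
                lastId = entry.getId();
            }

            context.commitChanges();            // only needed if rows were modified
        }
    }
}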
mrg
On Sun, Apr 12, 2009 at 11:
Well, this helped. I think the problem is my remote dev environment. The
same query, run locally on the MySQL server, only takes a couple of
seconds. I think connecting through Wi-Fi wrapped in a VPN connection is
slowing it down. I'm going to try running my whole program on the MySQL
server and se
On 13/04/2009, at 10:42 AM, Paul Logasa Bogen II wrote:
I have a table with ~683k entries in it. I have a maintenance task
that requires me to hit every entry. Since I know the results are
much too big to return all at once, I've set the page size to 50.
However, Cayenne appears to be attempting the full query first before
falling back to paging.
So I should just bite the bullet and loop over subqueries using limit then?
Or is there a more Cayenne way to do it?
plb
Kevin Menard wrote:
Assuming monotonically increasing PKs, you shouldn't really need to pull all
the PKs in. SELECT, ORDER BY, and LIMIT will serve well enough.
Sounds like you need to be using a ResultIterator.
See:
http://cayenne.apache.org/doc/api/org/apache/cayenne/access/ResultIterator.html
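Something along these lines, maybe. It's a sketch against the 2.0-style
API with a made-up entity name; in newer versions the row-reading method
is named a bit differently, so check the javadoc above.

import java.util.Map;

import org.apache.cayenne.CayenneException;
import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.access.ResultIterator;
import org.apache.cayenne.query.SelectQuery;

public class IterateEverything {

    public static void run() throws CayenneException {
        DataContext context = DataContext.createDataContext();
        SelectQuery query = new SelectQuery("MyEntity");

        // the iterator keeps a database cursor open, so close it in finally
        ResultIterator it = context.performIteratedQuery(query);
        try {
            while (it.hasNextRow()) {
                // one data row at a time; nothing accumulates in memory
                Map row = it.nextDataRow();
                // ... maintenance work against the raw column values ...
            }
        }
        finally {
            it.close();
        }
    }
}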
Robert
On Apr 12, 2009, at 7:42 PM, Paul Logasa Bogen II wrote:
I have a table with ~683k entries in it. I have a maintenance task
that requires me to
Assuming monotonically increasing PKs, you shouldn't really need to pull all
the PKs in. SELECT, ORDER BY, and LIMIT will serve well enough.
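In Cayenne terms, that works out to something like this. Again a sketch:
MyEntity and the "id" PK attribute are placeholders for the real model;
feed the last PK you saw back in for each pass.

import java.util.List;

import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.exp.ExpressionFactory;
import org.apache.cayenne.query.SelectQuery;

public class SliceQuery {

    // one slice of the scan: WHERE id > lastSeenId ORDER BY id LIMIT 50
    public static List nextSlice(DataContext context, long lastSeenId) {
        SelectQuery query = new SelectQuery(MyEntity.class,
                ExpressionFactory.greaterExp("id", lastSeenId));
        query.addOrdering("id", true);   // ascending by PK
        query.setFetchLimit(50);
        return context.performQuery(query);
    }
}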
--
Kevin
On Sun, Apr 12, 2009 at 8:52 PM, Michael Gentry wrote:
> Cayenne has to fetch the primary keys of your ~683k records first,
> which is why it is taking so long.
Cayenne has to fetch the primary keys of your ~683k records first,
which is why it is taking so long. After that, it will use the PKs to
fetch the records for each page you access (50 at a time in your case).
Eventually you'll have all ~683k in memory (if you have
enough memory). This will b
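In code, the behaviour being described looks roughly like this (MyEntity
is a stand-in for the real mapped class):

import java.util.List;

import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.query.SelectQuery;

public class PagedScan {

    public static void run() {
        DataContext context = DataContext.createDataContext();

        SelectQuery query = new SelectQuery(MyEntity.class);
        query.setPageSize(50);

        // returns only after the PKs of all ~683k rows have been read
        List all = context.performQuery(query);

        for (Object o : all) {
            // each new page of 50 objects is resolved lazily here, but
            // resolved objects stay registered in the context, so memory
            // use keeps growing as the loop advances
            MyEntity entry = (MyEntity) o;
            // ... maintenance work ...
        }
    }
}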
I have a table with ~683k entries in it. I have a maintenance task that
requires me to hit every entry. Since I know the results are much too
big to return all at once, I've set the page size to 50. However,
Cayenne appears to be attempting the full query first before falling
back to paging. Th
The query refreshes the root entity, but not the relationships. So
MisysDict will be refreshed, while related Xrefs will not. To ensure a
refresh of specific relationships, you can use prefetching:
http://cayenne.apache.org/doc/prefetching.html
In the documentation it is presented as a perfo
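For example, something like this ("xrefs" is just my guess at the
to-many relationship name from MisysDict to Xref in your map):

import java.util.List;

import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.query.SelectQuery;

public class RefreshWithPrefetch {

    public static List fetchDicts(DataContext context) {
        SelectQuery query = new SelectQuery(MisysDict.class);

        // prefetching the relationship makes Cayenne pull the related
        // Xref rows in the same query, so they get refreshed along with
        // the MisysDict rows instead of staying stale in the cache
        query.addPrefetch("xrefs");

        return context.performQuery(query);
    }
}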
I'm feeling thick, but I'm really stuck with what is becoming an
increasingly simple attempt to convince myself that I can get the simplest
of caching examples working.
My attempt now is to get two machines on two separate JVMs to have a
force-reload. To do this, I'm re-running the query that popul