Note that DataContext creation is cheap. What I typically do in situations like 
this is to periodically create a new data context that you can throw away. When 
it's gone, the associated objects will be gc'ed. For example, you could 
periodically dump the "readings" data context and create a fresh one after 
every 1,000 Reading objects, or whatever makes sense for your use case. If 
reading transactions are infrequent, it probably makes sense to create a new 
DataContext, import whatever objects you need into it (via localObject), create 
your readings, commit, then discard the DataContext. If the readings are very 
frequent, then it makes sense to have a dedicated "readings" DataContext that 
you can periodically swap out, along the lines of the sketch below.
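
A rough sketch of the second approach, assuming a Cayenne 3.0-style API 
(DataContext.createDataContext(), localObject(ObjectId, Object)) and made-up 
setter names on Reading (setToSubstancePoint, setValue); adjust to your actual 
model and Cayenne version:

import org.apache.cayenne.access.DataContext;

public class ReadingWriter {

    // Swap threshold is an assumption; tune it for your load.
    private static final int READINGS_PER_CONTEXT = 1000;

    private DataContext readingContext = DataContext.createDataContext();
    private int readingsInContext = 0;

    public void storeReading(SubstancePoint point, double summaryValue) {
        // Import the existing SubstancePoint into the throwaway context so the
        // new Reading can relate to it without touching the long-lived context.
        SubstancePoint localPoint =
                (SubstancePoint) readingContext.localObject(point.getObjectId(), null);

        Reading reading = readingContext.newObject(Reading.class);
        reading.setToSubstancePoint(localPoint); // assumed relationship setter
        reading.setValue(summaryValue);          // assumed attribute setter

        readingContext.commitChanges();

        // Periodically discard the context so the committed Readings (and
        // nothing else) become eligible for garbage collection.
        if (++readingsInContext >= READINGS_PER_CONTEXT) {
            readingContext = DataContext.createDataContext();
            readingsInContext = 0;
        }
    }
}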

HTH,

Robert

On Dec 11, 2011, at 12:51 AM, Chris Murphy (www.strandz.org) wrote:

> I have a server application that continually reads data from sensors. At
> set intervals the data is summarized. This summary data is used to create
> Cayenne data objects of type Reading. A short transaction commits these
> Reading objects to the database, after which it is not important that they
> are held in memory - they were created only to be stored. After a long
> period of time their continual collection results in an 'OutOfMemory' JVM
> condition.
> 
> There are many objects of type Reading for another Cayenne data object
> called SubstancePoint. And there's a whole object graph going back from
> there. I basically want to keep the whole of the object graph in memory,
> except for these Reading objects.
> 
> Is there a way to 'disappear' data objects from the DataContext? That is
> what I think would solve my problem. I have tried calling
> DataContext.unregisterObjects() post-commit on the Reading data objects I
> want removed from memory, but I can see that the leak is still going on.
> 
> Thank you ~ Chris Murphy
