On May 11, 2010, at 4:56 PM, mgarg...@escholar.com wrote:

1)
I'm using an Oracle 11g backend, have my sequences all set up (with caching set to 0 in Cayenne Modeler), and it works fine until my program throws an exception. The exception is thrown after I call commitChanges() on the DataContext, but the next time I run my application it complains about a PK violation, and I need to wipe my test DB before going forward. Any ideas?

So are you trying to re-commit the same "dirty" objects after the exception? I'm not sure whether Oracle sequences are transactional and subject to rollback, but if they are, I guess the assigned PK values are no longer valid. We may need to test this scenario in Cayenne (Jira, anyone?) and perhaps reset the generated object PKs on rollback. In the meantime, you can roll back your ObjectContext on exception and start with fresh objects.
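A minimal sketch of that rollback-on-exception pattern. The class and method names here are illustrative, not from the original post; the `commitChanges()` and `rollbackChanges()` calls are the standard Cayenne 3.0 DataContext API:

```java
import org.apache.cayenne.CayenneRuntimeException;
import org.apache.cayenne.access.DataContext;

// Sketch: roll back the context when commit fails, then rebuild the
// objects from scratch instead of re-committing the same dirty ones.
public class CommitWithRollback {

    static void saveSafely(DataContext context) {
        try {
            context.commitChanges();
        } catch (CayenneRuntimeException e) {
            // Discard all uncommitted changes, including any PK values
            // that may have come from a rolled-back Oracle sequence.
            context.rollbackChanges();
            // ... re-create fresh objects here and retry if appropriate
        }
    }
}
```

After the rollback, the context is clean, so newly registered objects will get freshly generated PKs on the next commit.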

2)
I'm storing zipped file content in BLOBs. It seemed to be working fine yesterday, but now I get the exception below (the files today might just be larger; these files can get pretty big, ~1.5 GB):

Exception in thread "Thread-7"
org.apache.cayenne.validation.ValidationException: [v.3.0RC2 Jan 30 2010
16:41:40] Validation failures: Validation failure for
com.mycomp.FileContentObject.content: "content" exceeds maximum allowed

Check the max length for this column in the Modeler. It looks like the error is coming from that setting.
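To see exactly which attribute and value tripped the check before adjusting the column in the Modeler, you can catch the ValidationException and inspect its failures. A sketch, assuming the standard Cayenne 3.0 validation API (`getValidationResult()`, `getFailures()`); the wrapper class and method name are illustrative:

```java
import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.validation.ValidationException;
import org.apache.cayenne.validation.ValidationFailure;

// Sketch: log each validation failure (e.g. "content exceeds maximum
// allowed length") before rethrowing, to aid diagnosis.
public class CommitDiagnostics {

    static void commitWithDiagnostics(DataContext context) {
        try {
            context.commitChanges();
        } catch (ValidationException e) {
            for (Object o : e.getValidationResult().getFailures()) {
                ValidationFailure f = (ValidationFailure) o;
                System.err.println("Validation failed on " + f.getSource()
                        + ": " + f.getDescription());
            }
            throw e;
        }
    }
}
```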

Andrus
