Re: Transactionally reading large files into a database without blowing up the Java heap

2025-07-15 Thread Håkan Lantz
Hello, We have now managed to verify bounded time and memory consumption in a solution implemented using two nested Split EIPs and passing the result to a SQL producer endpoint. I.e. int batchSize = 1000; from("file:test?readLock=changed") .transacted("txRequired")
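
For reference, a minimal sketch of the route that message appears to describe, written in Camel's Java DSL. The transaction policy bean name "txRequired" and the file URI are taken from the fragment above; the Bindy model class OrderRecord (sketched further down in this listing), the table and column names, and the SQL statement are assumptions, not confirmed details of the poster's project.

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.model.dataformat.BindyType;

public class LargeFileRoute extends RouteBuilder {

    static final int BATCH_SIZE = 1000;

    @Override
    public void configure() throws Exception {
        from("file:test?readLock=changed")
            .transacted("txRequired")
            // Outer split: stream the file and group BATCH_SIZE lines per
            // exchange, so the whole file is never held in memory at once.
            .split().tokenize("\n", BATCH_SIZE).streaming()
                // Inner split: one exchange per line inside the group.
                .split().tokenize("\n").streaming()
                    // Unmarshal a single CSV line into the Bindy model object.
                    .unmarshal().bindy(BindyType.Csv, OrderRecord.class)
                    // Insert one row; parameters are read from the Bindy object
                    // via simple expressions (hypothetical table and columns).
                    .to("sql:insert into orders (id, amount) "
                        + "values (:#${body.id}, :#${body.amount})")
                .end()
            .end();
    }
}

The grouped tokenizer in the outer split joins BATCH_SIZE lines into one String, and the inner split breaks that back into single lines, which keeps each exchange's working set small while the file is still read as a stream.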

Re: Transactionally reading large files into a database without blowing up the Java heap

2025-07-11 Thread Claus Ibsen
Hi, It's easier to help if you put a sample project somewhere on GitHub, make sure it can be built and run easily, and try with the latest releases as well. On Fri, Jul 11, 2025 at 9:34 AM Håkan Lantz wrote: > Hello, > > We have a large file that we would like to transactionally read ro

Transactionally reading large files into a database without blowing up the Java heap

2025-07-11 Thread Håkan Lantz
Hello, We have a large file that we would like to transactionally read row by row into a database using Bindy objects without blowing up the Java heap. Several attempts have been made using a nested Split EIP and the SQL producer with batch constructs. In one of them we used a combination of Split & Loop EIP
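
The thread does not show the actual record layout, so the following Bindy model is purely illustrative: a hypothetical @CsvRecord class with two fields that the route sketch earlier in this listing could unmarshal into.

import java.math.BigDecimal;

import org.apache.camel.dataformat.bindy.annotation.CsvRecord;
import org.apache.camel.dataformat.bindy.annotation.DataField;

// Hypothetical Bindy model for one CSV row; the separator, field positions
// and field names are placeholders, not taken from the original message.
@CsvRecord(separator = ",")
public class OrderRecord {

    @DataField(pos = 1)
    private long id;

    @DataField(pos = 2, precision = 2)
    private BigDecimal amount;

    public long getId() { return id; }

    public void setId(long id) { this.id = id; }

    public BigDecimal getAmount() { return amount; }

    public void setAmount(BigDecimal amount) { this.amount = amount; }
}

With getters in place, the SQL endpoint's :#${body.id} and :#${body.amount} simple expressions can pull the insert parameters straight from the unmarshalled object.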