The jdbc component in Camel 2.14 has a streaming list type.
The sql component does not have that.

For the split you can write a Java bean method that returns an
iterator, which walks the list in chunks. That is actually what
tokenize with a group size of 1000 does.
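A minimal sketch of such a bean, assuming a chunk size of 1000 and the names ChunkBean/chunk (both made up for illustration):

```java
import java.util.Iterator;
import java.util.List;
import java.util.Map;

// Hypothetical bean: exposes the sql result (List<Map<column, value>>)
// as an Iterator of fixed-size sub-lists, so the Splitter in streaming
// mode only handles one chunk at a time instead of one row at a time.
public class ChunkBean {

    private static final int CHUNK_SIZE = 1000; // assumed chunk size

    public Iterator<List<Map<String, Object>>> chunk(final List<Map<String, Object>> body) {
        return new Iterator<List<Map<String, Object>>>() {
            private int index;

            @Override
            public boolean hasNext() {
                return index < body.size();
            }

            @Override
            public List<Map<String, Object>> next() {
                // subList is a view, so no rows are copied per chunk
                int to = Math.min(index + CHUNK_SIZE, body.size());
                List<Map<String, Object>> chunk = body.subList(index, to);
                index = to;
                return chunk;
            }
        };
    }
}
```

Then point the splitter at that method instead of the body expression, something like
.split().method(ChunkBean.class, "chunk").streaming(), and each exchange in the split
carries a List of up to 1000 rows.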

On Thu, Nov 27, 2014 at 11:51 AM, dermoritz <[email protected]> wrote:
> Because of memory limitations I need to split a result from the sql component
> (List<Map<column,value>>) into smaller chunks (a few thousand rows each).
>
> I know about
> from(sql:...).split(body()).streaming().to(...)
>
> and i also know
> .split().tokenize("\n", 1000).streaming()
>
> but the latter does not work with List<Map<>> and also returns a
> String.
> Is there an out-of-the-box way to create those chunks? Or do I need to add a
> custom aggregator right after the split? Or is there another way?
>
> thanks in advance
>
>
>
> --
> View this message in context: 
> http://camel.465427.n5.nabble.com/split-big-sql-result-in-smaller-chunks-tp5759697.html
> Sent from the Camel - Users mailing list archive at Nabble.com.



-- 
Claus Ibsen
-----------------
Red Hat, Inc.
Email: [email protected]
Twitter: davsclaus
Blog: http://davsclaus.com
Author of Camel in Action: http://www.manning.com/ibsen
hawtio: http://hawt.io/
fabric8: http://fabric8.io/