Thanks Otavio!

Option 1 won't work because the Camel code, while it does consume a batch
based on those settings and iterates over the enumeration, only ever
passes one record at a time to the route. I thought about hacking that
class, but when we update Camel versions I'd have to remember I did
that.  :) . We cannot use the Aggregate EIP because the only real choice
of aggregation repository is in-memory, and that has the potential for
data loss. Using an RDBMS for the aggregation does not work in k8s, and
we are trying to bulk-update a database anyway.  Also, the offset is not
passed when auto commit is either on or off (I forget which one), and
with the combination of settings needed to support aggregation the
offset value is not passed.
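For reference, the option-1 shape I ruled out looks roughly like this
(topic name, batch sizes, and the target endpoint are just placeholders,
not our real config):

    import org.apache.camel.builder.AggregationStrategies;
    import org.apache.camel.builder.RouteBuilder;

    public class KafkaAggregateRoute extends RouteBuilder {
        @Override
        public void configure() {
            // maxPollRecords only controls how many records the underlying
            // client fetches; the Camel consumer still hands them to the route
            // one exchange at a time, so the batch has to be rebuilt with
            // aggregate(), and the default aggregation repository is
            // in-memory, which is the data-loss risk mentioned above.
            from("kafka:my-topic?maxPollRecords=500")
                .aggregate(constant(true), AggregationStrategies.groupedExchange())
                    .completionSize(500)
                    .completionTimeout(5000)
                .to("direct:bulkDbUpdate");
        }
    }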

I will look at option 2 again. I had dug in, but it seemed pretty
complicated.  I also dug into the Spring code to see how they did it, but
that was complex and I ran out of time.
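
For context, the Spring Kafka fallback I described in my original mail
below is roughly this shape (bean names, topic, and the direct endpoint
are made up; "batchFactory" is just a ConcurrentKafkaListenerContainerFactory
with setBatchListener(true), and I'm assuming camel-spring-boot so the
ProducerTemplate is auto-configured):

    import java.util.List;

    import org.apache.camel.CamelContext;
    import org.apache.camel.ProducerTemplate;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Component;

    @Component
    public class BatchForwarder {

        private final CamelContext camelContext;
        private final ProducerTemplate producerTemplate;

        public BatchForwarder(CamelContext camelContext, ProducerTemplate producerTemplate) {
            this.camelContext = camelContext;
            this.producerTemplate = producerTemplate;
        }

        // Batch listener: Spring hands over the whole poll as a List
        // (assumes String key/value deserializers).
        @KafkaListener(topics = "my-topic", containerFactory = "batchFactory")
        public void onBatch(List<ConsumerRecord<String, String>> records) throws InterruptedException {
            // The startup hack: block until the CamelContext reports "started".
            while (!camelContext.getStatus().isStarted()) {
                Thread.sleep(500);
            }
            // Hand the whole batch to a Camel route for the EIP processing
            // and the bulk database update.
            producerTemplate.sendBody("direct:bulkDbUpdate", records);
        }
    }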

Mainly I didn't want to duplicate effort if anyone had already started
this. I know you all are slammed.



On Wed, Mar 22, 2023 at 5:24 AM Otavio Rodolfo Piske <angusyo...@gmail.com>
wrote:

> Hi,
>
> Some ideas worth investigating:
>
> 1. Using a mix of current kafka client options (i.e.: like maxPollRecords,
> maxPollInterval, etc) along with aggregate EIP
> 2. Create your own KafkaClientFactory that wraps a custom Producer/Consumer
> wrapping the Spring Kafka Consumer.
>
> If none of this works, then I suggest opening a ticket with your
> suggestion. We have been pretty busy / overloaded with the work on Camel 4,
> so it's pretty easy for us to miss bug reports and interesting feature
> suggestions - such as yours. If possible, try to provide pseudo-code, a
> reproducer or a unit test that the community can look at and work with.
>
>
> Thanks
>
> On Wed, Mar 15, 2023 at 1:31 PM Mark Nuttall <mknutt...@gmail.com> wrote:
>
> > I know consuming Kafka messages in a batch is not currently supported. I
> > googled and I didn't find any real options.  I looked at hacking the
> Camel
> > classes to implement it and decided that it was too risky.
> >
> > So for now we are falling back to using a Spring Kafka Consumer. The
> issue
> > with that is I still want to use Camel because we need to do some complex
> > EIP processing and republishing of messages. And since Spring does not
> know
> > about Camel Context startup I have to do a hack to make it wait if the
> > CamelContext is not in the "started" state.
> >
> > Questions:
> >
> >    - Is it possible to use the Spring Kafka Consumer in the "From" and be
> >    managed by Camel (I think not, but I figured I would ask).
> >    - Is implementing batch consumers in the Kafka component planned/in
> >    progress?
> >    - If none of these - any ideas to make it work with the Kafka
> component?
> >
>
>
> --
> Otavio R. Piske
> http://orpiske.net
>
