Thanks Bruno, yeah, I think I could figure it out. For dependencies such
as the database, for which all the events will be blocked, we are planning to
put in a retry mechanism, so processing will wait until the database
connection is back up. If the problem is with the incoming event, like a bad
format etc…
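The blocking retry described above could be sketched roughly like this in plain Java (a minimal sketch: the `Supplier` stands in for the real database call, and the backoff value is arbitrary):

```java
import java.util.function.Supplier;

public class BlockingRetry {
    // Keep retrying the action until it succeeds, backing off between
    // attempts. Subsequent events are blocked until this returns, which
    // preserves processing order while the dependency is down.
    static <T> T retryUntilSuccess(Supplier<T> action, long backoffMs)
            throws InterruptedException {
        while (true) {
            try {
                return action.get();
            } catch (RuntimeException e) {
                // e.g. database connection not yet back up; wait and retry
                Thread.sleep(backoffMs);
            }
        }
    }
}
```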
Hi Pushkar,
if you do not want to lose any event, you should cache the events
somewhere (e.g. a state store) in case there is an issue with an
external system you connect to (e.g. a database issue). If the order of
the events is important, you must ensure that the events in your cache
are proces…
Thank you Gilles, will take a look.
Bruno, thanks for your elaborate explanation as well. However, it
basically exposes my application to certain issues.
E.g. the application deals with agent states of a call center, where
the order of processing is important. So when an agent is logged in th…
Hi Pushkar,
Uber has written about how they deal with failures and reprocessing here,
it might help you achieve what you describe:
https://eng.uber.com/reliable-reprocessing/.
Unfortunately, there isn't much written documentation about those patterns.
There's also a good talk from Confluent's Ant…
Hi Pushkar,
I think there is a misunderstanding. If a consumer polls from a
partition, it will always poll the next event independently of whether
the offset was committed or not. Committed offsets are used for fault
tolerance, i.e., when a consumer crashes, the consumer that takes over
the work…
Bruno,
So, essentially, we are just waiting on the processing of the first event
that got an error before going ahead to the next one.
Second, if the application handles storing the events in a state store for
retry, Kafka Streams would essentially commit the offsets of those events,
so the next event will be…
Pushkar,
I don't know if this meets your needs, but I recently implemented something
similar in Samza (which I would classify as a hack, but it works); my
solution included:
- check for the pause condition on each message
- if the condition is met, then go into a while-true-sleep loop
- alternate…
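The while-true-sleep part of the hack above might look roughly like this in plain Java (a sketch: `pauseConditionMet` is a hypothetical predicate checked on each message, and the sleep interval is arbitrary):

```java
import java.util.function.BooleanSupplier;

public class PauseLoop {
    // Block the processing thread while the pause condition holds,
    // re-checking periodically: the while-true-sleep loop.
    static void awaitResume(BooleanSupplier pauseConditionMet, long sleepMs)
            throws InterruptedException {
        while (pauseConditionMet.getAsBoolean()) {
            Thread.sleep(sleepMs);
        }
    }
}
```

Note that sleeping on the polling thread this long can trip consumer liveness timeouts, which is part of why it is a hack.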
Hi Pushkar,
If you want to keep the order, you could still use the state store I
suggested in my previous e-mail and implement a queue on top of it. For
that you need to put the events into the store with a key that
represents the arrival order of the events. Each time a record is
received fr…
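The queue-on-a-state-store idea above could be sketched like this, using an in-memory `TreeMap` as a stand-in for the real key-value state store (a sketch only: a monotonically increasing sequence number plays the role of the arrival-order key):

```java
import java.util.TreeMap;

public class ArrivalOrderQueue<V> {
    // Monotonically increasing sequence number used as the store key,
    // so key order equals arrival order.
    private long nextSeq = 0;
    private final TreeMap<Long, V> store = new TreeMap<>();

    public void enqueue(V event) {
        store.put(nextSeq++, event);
    }

    // Peek at the oldest buffered event, or null if the queue is empty.
    public V peekOldest() {
        return store.isEmpty() ? null : store.firstEntry().getValue();
    }

    // Remove the oldest event once it has been processed successfully.
    public V dequeueOldest() {
        return store.isEmpty() ? null : store.pollFirstEntry().getValue();
    }

    public boolean isEmpty() {
        return store.isEmpty();
    }
}
```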
Bruno,
1. the loading of the topic mapped to the GlobalKTable is done by some other
service/application, so when my application starts up, it will just sync a
GlobalKTable against that topic, and if that other service/application is
still starting up then it may not have loaded that data onto that topic and
t…
Thank you for clarifying! Now, I think I understand.
You could put events for which the required data in the global table is not
available into a state store, and each time an event from the input topic
is processed, you could look up all events in your state store and see if
the required data is now av…
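That buffer-and-recheck step could be sketched like this (a sketch only: a plain list stands in for the state store, and the `dataAvailable` predicate stands in for the GlobalKTable lookup):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Predicate;

public class BufferedDrain<E> {
    private final List<E> buffer = new ArrayList<>();

    // Park an event whose required global-table data is missing.
    public void buffer(E event) {
        buffer.add(event);
    }

    // On each incoming event, revisit buffered events in arrival order and
    // process those whose required data has become available.
    public void drainReady(Predicate<E> dataAvailable, Consumer<E> process) {
        Iterator<E> it = buffer.iterator();
        while (it.hasNext()) {
            E e = it.next();
            if (dataAvailable.test(e)) {
                process.accept(e);
                it.remove();
            }
        }
    }

    public int pending() {
        return buffer.size();
    }
}
```

If strict ordering also matters, the drain would instead stop at the first event whose data is still missing rather than skipping past it.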
Say the application level exception is named as :
MeasureDefinitionNotAvaialbleException
What I am trying to achieve is: in the above case, when the event processing
fails due to required data not being available, the streams application
should not proceed on to the next event; instead it should wait for the
processing of c…
It is not a Kafka Streams error, it is an application level error, e.g. say,
some data required for processing an input event is not available in the
GlobalKTable since it is not yet synced with the global topic.
On Mon, Sep 21, 2020 at 4:54 PM Bruno Cadonna wrote:
Hi Pushkar,
Is the error you are talking about, one that is thrown by Kafka Streams
or by your application? If it is thrown by Kafka Streams, could you
please post the error?
I do not completely understand what you are trying to achieve, but maybe
max.task.idle.ms [1] is the configuration yo…
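For reference, `max.task.idle.ms` controls how long a stream task idles when some (but not all) of its input partitions have buffered data, to improve timestamp-order processing across streams. It is set like any other Streams config; a sketch (the value here is arbitrary, not a recommendation):

```java
import java.util.Properties;

public class IdleConfig {
    static Properties streamsConfig() {
        Properties props = new Properties();
        // How long a task stays idle waiting for data on all of its input
        // partitions before processing what it has.
        props.put("max.task.idle.ms", "5000");
        return props;
    }
}
```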
Hi,
I would like to know how to handle the following scenarios while processing
events in a Kafka Streams application:
1. the streams application needs data from a GlobalKTable which loads it
from a topic that is populated by some other service/application. So, if
the streams application starts getti…