Mike, a couple of things you might take a look at:

Kafka Connect might be useful for writing to your DB. Your
consumer -> producer -> DB flow seems to fit well with Connect.
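If you go the Connect route, a JDBC sink could take over the DB-writing step. A rough sketch, assuming the Confluent JDBC sink connector is installed; all names, topics, and connection details below are placeholders:

```properties
# All names, topics, and connection details are placeholders
name=my-db-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=my-records-topic
connection.url=jdbc:postgresql://db-host:5432/mydb
# Upsert keyed on the Kafka record key; create the table if missing
insert.mode=upsert
pk.mode=record_key
auto.create=true
```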
Transactions might be useful for ensuring that your application runs to
completion before committing records. It sounds like your ...
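For the transactional consume-process-produce path, the relevant client settings are roughly as follows (the transactional.id is a placeholder and must be unique per producer instance):

```properties
# Producer side: placeholder transactional.id, unique per instance
transactional.id=my-app-tx-1
enable.idempotence=true
acks=all

# Downstream consumers: read only committed records
isolation.level=read_committed
```

In code, the producer then brackets each batch with initTransactions()/beginTransaction(), uses sendOffsetsToTransaction() to commit the consumer offsets atomically with the output, and finishes with commitTransaction().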
> However, we can't figure out a way to turn off key deserialization (if
> that is what is causing this) on the Kafka Connect / connector side.
Set *key.converter* to match the serialization format of the keys in the source messages.
https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained
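A minimal sketch of the relevant settings — the converter classes below assume string keys and schemaless JSON values, so adjust them to your actual formats:

```properties
# Assumed formats: string keys, schemaless JSON values
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```

If the keys should pass through untouched, org.apache.kafka.connect.converters.ByteArrayConverter is another common choice.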
Hello, hope you all are great today!
I am using a Kafka Streams application:
...
final StreamsBuilder streamsBuilder = new StreamsBuilder();
final KStream<Windowed<MyObject>, MyObject> myObjects =
    streamsBuilder
        .stream(inputTopicNames, Consumed.with(
            myObjectsWindowSerde, myObjectsAv
Hi,
We have one application that produces various data (dataset1, dataset2, ...)
at various sites when our application is executed. Currently dataset1, dataset2,
etc. are stored in separate *.json data files for each execution.
We don't want to transfer the data when we are running the application