Consuming a topic to back up events

2020-02-05 Thread Marcel Koopman
Hello, I've got an topic with the default retention rate of 7 days. Now on production we need to retrieve the events because of an incident. Can i just spin up a new consumer (no group id), offset beginning and consume the data? It should not have any effect on other running consumers right? --
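A minimal sketch of such a one-off reader, assuming the Java client, String serdes, and a hypothetical topic name "events" (broker address is a placeholder). Because it sets no group.id and uses assign() plus seekToBeginning() instead of subscribe(), it never joins a consumer group, so it cannot trigger a rebalance in the groups that are already consuming the topic:

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import java.util.stream.Collectors;

public class BackupReader {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // No group.id: the consumer never joins a group and cannot commit offsets,
        // so it has no effect on the consumer groups already reading this topic.
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        String topic = "events"; // hypothetical topic name
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // subscribe() requires a group.id, so assign all partitions manually.
            List<TopicPartition> partitions = consumer.partitionsFor(topic).stream()
                    .map(p -> new TopicPartition(topic, p.partition()))
                    .collect(Collectors.toList());
            consumer.assign(partitions);
            consumer.seekToBeginning(partitions);

            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                if (records.isEmpty()) {
                    break; // crude stop condition, good enough for a one-off backup read
                }
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s-%d@%d: %s%n",
                            record.topic(), record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}

The 7-day retention caveat still applies: events older than the retention period may already have been deleted by the broker, regardless of where the new consumer starts reading.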

Adding a new sub-topology requires reset?

2020-02-05 Thread Murilo Tavares
Hi, I have a KafkaStreams application that's pretty simple and acts as a repartitioner... It reads from input topics and sends to output topics, based on an input-to-output topic map. It has a custom Repartitioner that is responsible for assigning new partitions to the data in the output topic.
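A minimal sketch of that kind of topology, assuming String keys and values and a simple hash-based StreamPartitioner (the class names and topic map are hypothetical, not the poster's actual code). Each input/output pair gets its own source, which is why every new pair shows up as a new sub-topology in the application:

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.processor.StreamPartitioner;

import java.util.Map;

public class RepartitionerTopology {

    // Hypothetical partitioner: spread records over the output topic's partitions by key hash.
    static class KeyHashPartitioner implements StreamPartitioner<String, String> {
        @Override
        public Integer partition(String topic, String key, String value, int numPartitions) {
            return Math.floorMod(key.hashCode(), numPartitions);
        }
    }

    public static Topology build(Map<String, String> inputToOutput) {
        StreamsBuilder builder = new StreamsBuilder();
        // One independent source per entry; each becomes its own sub-topology.
        inputToOutput.forEach((input, output) ->
                builder.stream(input, Consumed.with(Serdes.String(), Serdes.String()))
                       .to(output, Produced.with(Serdes.String(), Serdes.String())
                                           .withStreamPartitioner(new KeyHashPartitioner())));
        return builder.build();
    }
}

Because the sub-topologies are numbered from the topology structure, adding or removing an input/output pair renumbers them, which is what makes the question about needing an application reset relevant.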

Re: Adding a new sub-topology requires reset?

2020-02-05 Thread Murilo Tavares
The NPE in KafkaStreams 2.4.0 looked like this: 2020-02-05 14:54:09.392 [mico-repartitioner-665ab051-b233-4a45-88af-5dab135fde8b-StreamThread-2] INFO org.apache.kafka.streams.processor.internals.StreamThread - stream-thread [mico-repartitioner-665ab051-b233-4a45-88af-5dab135fde8b-StreamThread-2] S