Hi, I have the following problem: my Kafka consumer is consuming messages, but processing a message might fail. I do not want to retry until it succeeds; instead I want to quickly move on to the next message. At a later time, however, I might still want to reprocess the failed messages. So I thought about storing a list of the offsets of the messages that failed on the first try, for later processing. But that only makes sense if offsets are unique, immutable identifiers for a message within a topic. Since Kafka deletes messages or compacts the log after some time, I was wondering whether this is really the case. If not, how could I uniquely identify a message within a topic, so that a consumer knows where to start consuming again?
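To make it concrete, here is roughly the bookkeeping I have in mind, as a self-contained sketch (no real Kafka broker involved; the message stream and the processing step are simulated, and all names are made up). A failed message is remembered by its (topic, partition, offset) coordinates, which is exactly the part I am unsure about, since those offsets may point at deleted or compacted-away records later:

```python
# Sketch only: simulated messages instead of a real KafkaConsumer, so the
# retry bookkeeping itself is runnable. Each message is identified by its
# (topic, partition, offset) coordinates.

failed = []  # (topic, partition, offset) triples of messages to retry later


def process(value):
    # Hypothetical processing step; raises on bad input.
    if value is None:
        raise ValueError("cannot process empty message")
    return value.upper()


# Simulated records: (topic, partition, offset, value).
messages = [
    ("orders", 0, 100, "a"),
    ("orders", 0, 101, None),  # this one fails on the first try
    ("orders", 0, 102, "c"),
]

for topic, partition, offset, value in messages:
    try:
        process(value)
    except Exception:
        # Do not block on retries: remember the coordinates and move on.
        failed.append((topic, partition, offset))

# Later, a separate consumer could seek() to each stored offset and
# reprocess -- assuming the record still exists at that offset.
print(failed)
```

The open question for me is the last comment: whether an offset stored this way is guaranteed to still identify the same message (or any message at all) after deletion or log compaction.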
Thank you,
Andreas Maier

AS ideAS Engineering
Axel-Springer-Straße 65
10888 Berlin
Mobile: +49 (0) 151 730 26 414
andreas.ma...@asideas.de

Axel Springer ideAS Engineering GmbH
A company of Axel Springer SE
Registered office Berlin, District Court Charlottenburg, HRB 138466 B
Managing Directors: Daniel Keller, Niels Matusch