We are using Kafka Connect to stream from a database with a JDBC Connector. Some rows were wrongly deleted, and as a result our key-value stores are now stale.
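What we believe we need is a tombstone record: a message with the deleted key and a null value, which Kafka Streams interprets as a delete for that key in the KTable. As a reference, here is a minimal Java sketch of what we are effectively trying to reproduce from the command line (the class name is ours, and using Confluent's KafkaAvroSerializer on the value side is our assumption, since the value serializer is never really exercised for a null payload):

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class TombstoneProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Plain int key, matching the existing records on the topic
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.IntegerSerializer");
        // Value serializer matches the topic's Avro values, but it is
        // effectively a no-op for a null payload
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://my-schema-registry");

        try (KafkaProducer<Integer, Object> producer = new KafkaProducer<>(props)) {
            // Null value = tombstone: compacted topics and KTables treat it
            // as a deletion of the entry for this key
            producer.send(new ProducerRecord<>("myTopic", 12213, null));
            producer.flush();
        }
    }
}
```

We would prefer to do this with the console tools, though, and that is where we got stuck.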
We thought we could produce such a tombstone (the deleted key with a null payload, to trigger eviction from the view) using `kafka-avro-console-producer`, but we didn't manage to. When invoking it like so:

```
kafka-avro-console-producer --broker-list localhost:9092 \
  --topic myTopic \
  --key-serializer org.apache.kafka.common.serialization.IntegerSerializer \
  --property "parse.key=true" \
  --property "key.separator=:" \
  --property "schema.registry.url=http://my-schema-registry" \
  --property "value.schema=avroSchema"
```

we got the error "Must provide the Avro schema string in value.schema". Our key is a plain INT, so it is not Avro. Can we produce non-Avro keys with `kafka-avro-console-producer`?

When using `kafka-console-producer` instead, like so:

```
kafka-console-producer --broker-list localhost:9092 \
  --topic myTopic \
  --key-serializer org.apache.kafka.common.serialization.IntegerSerializer \
  --property "parse.key=true" \
  --property "key.separator=:"
```

and entering `12213:null`, the string "null" was treated as a byte array and broke our deserializer.

What is the right way to fix our stale KTable?

Thank you,
Edmondo