Short answer - you cannot. The existing data is not reprocessed, since Kafka
itself has no knowledge of how you did your partitioning.
The usual workaround is to stop the producers and consumers, create a new
topic with the desired number of partitions, consume the old topic from the
beginning, and write every record (with its original key) to the new topic.
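
A minimal sketch of that copy step with the plain Java clients; the broker
address, group id, topic names ("old-topic" with four partitions, "new-topic"
with six) and the String serializers are assumptions, and the stop condition
is deliberately crude:

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.*;
    import org.apache.kafka.clients.producer.*;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class TopicCopy {
        public static void main(String[] args) {
            Properties cp = new Properties();
            cp.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            cp.put(ConsumerConfig.GROUP_ID_CONFIG, "topic-copy");            // assumed group id
            cp.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");     // read from the beginning
            cp.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            cp.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            Properties pp = new Properties();
            pp.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            pp.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            pp.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cp);
                 KafkaProducer<String, String> producer = new KafkaProducer<>(pp)) {
                consumer.subscribe(Collections.singletonList("old-topic"));
                int emptyPolls = 0;
                while (emptyPolls < 5) {                                     // give up after a few empty polls
                    ConsumerRecords<String, String> records = consumer.poll(1000);
                    if (records.isEmpty()) { emptyPolls++; continue; }
                    emptyPolls = 0;
                    for (ConsumerRecord<String, String> r : records) {
                        // Keep the original key so the default partitioner sends all
                        // messages for a key to the same partition of the new topic.
                        producer.send(new ProducerRecord<>("new-topic", r.key(), r.value()));
                    }
                }
                producer.flush();
            }
        }
    }

Once the copy is done, point the producers and consumers at the new topic.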
Yes, I understand that.
The Streams application takes care of that when I do:

input
    .map(new KeyValueMapper<K, V, KeyValue<K, V>>() {
        public KeyValue<K, V> apply(K key, V value) {
            // ... derive new_key and new_value ...
            return new KeyValue<>(new_key, new_value);
        }
    })
    .through(k_serde, v_serde, ...);
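
For reference, a minimal self-contained sketch of that approach against the
0.10.x-era Streams API that the through(keySerde, valueSerde, topic) overload
above belongs to; the bootstrap servers, application id, topic names and
String serdes are assumptions, and the target topic should be created up
front with the desired six partitions (through() writes to an existing topic
rather than creating one with a custom partition count):

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serde;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KStreamBuilder;

    public class RepartitionJob {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "repartition-job");   // assumed application id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            Serde<String> kSerde = Serdes.String();
            Serde<String> vSerde = Serdes.String();

            KStreamBuilder builder = new KStreamBuilder();
            KStream<String, String> input = builder.stream(kSerde, vSerde, "old-topic");

            input
                // Re-key the records here; map() is where the real key/value logic goes.
                .map((key, value) -> new KeyValue<>(key, value))
                // Write through the pre-created six-partition topic; downstream consumers
                // (or the rest of the topology) read the repartitioned data from it.
                .through(kSerde, vSerde, "repartitioned-topic");

            new KafkaStreams(builder, props).start();
        }
    }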
Hi
You can specify a partition function when producing messages to the Kafka
brokers; this function determines which partition each message is sent to.
See
https://edgent.apache.org/javadoc/r1.1.0/org/apache/edgent/connectors/kafka/KafkaProducer.html#publish-org.apache.edgent.topology.TStre
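
The same thing with the plain Java producer client is done by plugging a
custom Partitioner into the producer config; a minimal sketch (the class name
and the hash-modulo scheme are illustrative assumptions, not something
prescribed in this thread):

    import java.util.Map;
    import org.apache.kafka.clients.producer.Partitioner;
    import org.apache.kafka.common.Cluster;
    import org.apache.kafka.common.utils.Utils;

    // Routes every message with the same key to the same partition by hashing the key.
    public class KeyHashPartitioner implements Partitioner {

        @Override
        public int partition(String topic, Object key, byte[] keyBytes,
                             Object value, byte[] valueBytes, Cluster cluster) {
            int numPartitions = cluster.partitionsForTopic(topic).size();
            if (keyBytes == null) {
                return 0;   // all unkeyed messages go to partition 0 in this sketch
            }
            return Utils.toPositive(Utils.murmur2(keyBytes)) % numPartitions;
        }

        @Override
        public void configure(Map<String, ?> configs) { }

        @Override
        public void close() { }
    }

It is enabled with props.put(ProducerConfig.PARTITIONER_CLASS_CONFIG,
KeyHashPartitioner.class.getName()). Note that any scheme based on "hash
modulo partition count" still remaps keys when the partition count grows,
which is exactly the issue the question below runs into.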
Hi,
I have a topic with four partitions, and data is distributed among them
based on a specified key.
If I want to increase the number of partitions to six, how can I do that
while also making sure that messages for a given key always go to one
(specific) partition only?
Will the existing messages be redistributed across the new partitions?
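
To see why growing the topic breaks the key-to-partition mapping, here is the
arithmetic the default producer partitioner applies to keyed messages (a
murmur2 hash of the key bytes, modulo the partition count); the key strings
are made up:

    import java.nio.charset.StandardCharsets;
    import org.apache.kafka.common.utils.Utils;

    public class PartitionMapping {
        // Same formula the default partitioner uses for messages that carry a key.
        static int partitionFor(String key, int numPartitions) {
            byte[] keyBytes = key.getBytes(StandardCharsets.UTF_8);
            return Utils.toPositive(Utils.murmur2(keyBytes)) % numPartitions;
        }

        public static void main(String[] args) {
            for (String key : new String[] {"user-1", "user-2", "user-3"}) {
                System.out.printf("key=%s  with 4 partitions -> %d, with 6 partitions -> %d%n",
                        key, partitionFor(key, 4), partitionFor(key, 6));
            }
        }
    }

Whenever the two numbers differ for a key, new messages for that key start
landing on a different partition than the old ones, while the already-written
messages stay where they are; that is why the replies above suggest writing
the data into a freshly created topic rather than altering the existing one.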