Re: Is KTable cleaned up automatically in a Kafka streams application?

2018-04-19 Thread Mihaela Stoycheva
be consumed and that specific entry for that product needs to be deleted from the KTable with aggregated data using a tombstone. If I don't do that, the entry will never be deleted and will stay in the KTable. Is this correct? Thanks, Mihaela Stoycheva On Thu, Apr 19, 2018 at 3:12 PM, Matthias J
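The tombstone mechanism the thread refers to is the standard Kafka one: a record whose value is null marks its key for deletion in a compacted topic and in a KTable read directly from that topic. Below is a minimal sketch with a plain Java producer, assuming a hypothetical topic product-events and key product-42; whether a tombstone on the input is the right way to clear an entry from an aggregated KTable is exactly what the thread goes on to discuss.

  import java.util.Properties;
  import org.apache.kafka.clients.producer.KafkaProducer;
  import org.apache.kafka.clients.producer.ProducerConfig;
  import org.apache.kafka.clients.producer.ProducerRecord;
  import org.apache.kafka.common.serialization.StringSerializer;

  public class TombstoneExample {
      public static void main(String[] args) {
          Properties props = new Properties();
          props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
          props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
          props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

          try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
              // A null value is a tombstone: log compaction and a KTable built with
              // builder.table("product-events") treat it as "delete this key".
              producer.send(new ProducerRecord<>("product-events", "product-42", null));
              producer.flush();
          }
      }
  }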

Is KTable cleaned up automatically in a Kafka streams application?

2018-04-19 Thread Mihaela Stoycheva
manually - clean up old state that is no longer needed? Regards, Mihaela Stoycheva

NotLeaderForPartitionException while restarting Kafka brokers

2018-04-12 Thread Mihaela Stoycheva
messages and had to be restarted. In production there are 5 brokers and the replication factor is 3. The version of Kafka Streams that I use is 1.0.1 and the Kafka version of the broker is 1.0.1. My question is whether this is expected behavior. Also, is there any way to deal with it? Regards, Mihaela
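NotLeaderForPartitionException during a rolling broker restart is normally transient: partition leadership moves to another replica and the client retries. One common mitigation is to let the embedded Streams producer retry through the leader election rather than fail. The sketch below only illustrates that idea and is not the resolution confirmed in this thread; the application id, bootstrap servers, and retry values are placeholders.

  import java.util.Properties;
  import org.apache.kafka.clients.producer.ProducerConfig;
  import org.apache.kafka.streams.StreamsConfig;

  public class ResilientStreamsConfig {
      // Returns properties to pass to new KafkaStreams(topology, props).
      public static Properties build() {
          Properties props = new Properties();
          props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");   // hypothetical id
          props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
          // Keep internal topics replicated so one restarted broker is not fatal.
          props.put(StreamsConfig.REPLICATION_FACTOR_CONFIG, 3);
          // Let the internal producer retry across a leader election.
          props.put(StreamsConfig.producerPrefix(ProducerConfig.RETRIES_CONFIG), 10);
          props.put(StreamsConfig.producerPrefix(ProducerConfig.RETRY_BACKOFF_MS_CONFIG), 1000);
          return props;
      }
  }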

Re: Question about Kafka Streams error message when a message is larger than the maximum size the server will accept

2018-03-28 Thread Mihaela Stoycheva
when you have caching enabled, the value of the record has already been serialized before sending to the changelogger while the key was not. Admittedly it is not very friendly for trouble-shooting related log4j entries.. Guozhang On Tue, Mar 27
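When a record sent to an internal changelog or repartition topic is larger than the broker will accept, one way to address it is to raise the producer's max.request.size and the max.message.bytes of the internal topics through the corresponding StreamsConfig prefixes. This is only a sketch with illustrative values, not the fix discussed in this thread; the broker-side message.max.bytes must also allow the larger records.

  import java.util.Properties;
  import org.apache.kafka.clients.producer.ProducerConfig;
  import org.apache.kafka.common.config.TopicConfig;
  import org.apache.kafka.streams.StreamsConfig;

  public class LargeMessageConfig {
      // Illustrative limit; pick a size that matches the actual records.
      private static final int TWO_MB = 2 * 1024 * 1024;

      public static Properties build() {
          Properties props = new Properties();
          props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");   // hypothetical id
          props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
          // Allow the embedded producer to send larger requests.
          props.put(StreamsConfig.producerPrefix(ProducerConfig.MAX_REQUEST_SIZE_CONFIG), TWO_MB);
          // Create internal topics (changelogs, repartition topics) with a matching per-message limit.
          props.put(StreamsConfig.topicPrefix(TopicConfig.MAX_MESSAGE_BYTES_CONFIG), TWO_MB);
          return props;
      }
  }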

Question about Kafka Streams error message when a message is larger than the maximum size the server will accept

2018-03-27 Thread Mihaela Stoycheva
s JSON and the value logged as a byte array instead of JSON? Regards, Mihaela Stoycheva