Questions about upgrading to Kafka 1.0 from 0.10.0

2017-11-21 Thread Anish Mashankar
Hello Kafka users! My first question relates to the documentation. I see that we no longer have to change the message format version when upgrading to 1.0. So, will all clients continue to work after performing the rolling upgrade? We are running Kafka v0.10.0.0. The Kafka ecosystem…

org.apache.kafka.common.errors.TimeoutException: Failed to send request after 5000 ms

2017-11-21 Thread Abhimanyu Nagrath
I am using Kafka v0.10.2 (8 cores, 16 GB RAM). While executing the command ./kafka-consumer-groups.sh --bootstrap-server localhost:9092 --describe --group , I sometimes get this exception: Exception in consumer group command (kafka.admin.ConsumerGroupCommand$) java.lang.Runti…

Kafka ERROR after upgrade to Kafka 1.0.0 version : java.lang.OutOfMemoryError: Java heap space

2017-11-21 Thread Vinay Kumar
Hi, I upgraded Kafka from 0.10.2.1 to 1.0.0, and only since then I'm seeing the Kafka service go down because of the following issue: ERROR [KafkaApi-1] Error when handling request ; (kafka.server.KafkaApis) {replica_id=-1,max_wait_time=500,min_bytes=1,topics=[{topic= ,partitions=[{partiti…

Re: Questions about upgrading to Kafka 1.0 from 0.10.0

2017-11-21 Thread Ismael Juma
Hi Anish, The documentation is a bit misleading; see the following JIRA: https://issues.apache.org/jira/browse/KAFKA-6238 All of your clients will still work after the upgrade, but there is an efficiency hit if the message format used for the topic is newer than the message format supported by t…
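[Editor's note: the efficiency hit Ismael mentions is usually managed through the broker's version settings during the documented two-phase rolling upgrade. A sketch of the phase-1 broker config, assuming the 0.10.0 starting point from this thread (adjust the values to your actual source version):]

```properties
# Phase 1: set these on every broker BEFORE rolling-restarting onto the
# 1.0.0 binaries, so protocol and on-disk format stay at the old version.
inter.broker.protocol.version=0.10.0
log.message.format.version=0.10.0
```

Phase 2: once all brokers are on 1.0.0, bump inter.broker.protocol.version to 1.0 and do a second rolling restart. Leaving log.message.format.version at the old value until most consumers are upgraded avoids the down-conversion overhead for old clients.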

Re: Kafka ERROR after upgrade to Kafka 1.0.0 version : java.lang.OutOfMemoryError: Java heap space

2017-11-21 Thread Ismael Juma
Hi Vinay, The issue you describe looks like KAFKA-6185. You can either build the latest code in the 1.0 branch (which includes the fix) or downgrade to 0.11.0.2. Regards, Ismael

Kafka consumer query.

2017-11-21 Thread sumit singhal
Hi Team, My Kafka consumer has 2 threads and the number of partitions is, let's say, 10, so 5 partitions per consumer thread. I am saving the time at which a particular record needs to be processed. Now, if record1 on partition1 needs to be picked up 10 hours from now, the thread should move to the next p…
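[Editor's note: one common shape for this is to park not-yet-due records in a per-thread priority queue ordered by due time, so the poll loop keeps servicing the other partitions; with the real consumer you would typically pair this with KafkaConsumer.pause()/resume() on the affected partitions. A minimal sketch of just the scheduling logic, with illustrative names not taken from the original post:]

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.PriorityQueue;

// Hypothetical sketch: records whose processing time lies in the future are
// deferred into a priority queue keyed by due time, and drained once due.
public class DelayedRecordBuffer {
    private static final class Pending {
        final long dueAtMs;
        final String record;
        Pending(long dueAtMs, String record) {
            this.dueAtMs = dueAtMs;
            this.record = record;
        }
    }

    private final PriorityQueue<Pending> queue =
        new PriorityQueue<>(Comparator.comparingLong(p -> p.dueAtMs));

    // Park a record until its due time.
    public void defer(String record, long dueAtMs) {
        queue.add(new Pending(dueAtMs, record));
    }

    // Drain every record whose due time has passed, in due-time order.
    public List<String> drainDue(long nowMs) {
        List<String> due = new ArrayList<>();
        while (!queue.isEmpty() && queue.peek().dueAtMs <= nowMs) {
            due.add(queue.poll().record);
        }
        return due;
    }

    public static void main(String[] args) {
        DelayedRecordBuffer buf = new DelayedRecordBuffer();
        buf.defer("record1", 36_000_000L); // due 10 hours from epoch 0
        buf.defer("record2", 1_000L);      // due soon
        System.out.println(buf.drainDue(2_000L)); // only record2 is due
    }
}
```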

Time-Based Index for Consuming Messages Up to Certain Timestamp

2017-11-21 Thread Ray Ruvinskiy
I’ve been reading https://cwiki.apache.org/confluence/display/KAFKA/KIP-33+-+Add+a+time+based+log+index and trying to determine whether I can use the time-based index as an efficient way to sort a stream of messages into timestamp (CreateTime) order. I am dealing with a number of sources emitti…

Streams and Windows

2017-11-21 Thread Puneet Lakhina
Hello, I'm new to the Kafka ecosystem, so I apologize if this is a naive question. What I'm looking to accomplish is the following: we get heartbeat events from a source which are piped into a Kafka topic. These events are of the form {"session_id": "foo", "total_time_spent": x}. We get these…

Re: Time-Based Index for Consuming Messages Up to Certain Timestamp

2017-11-21 Thread Matthias J. Sax
This is possible, but I think you don't need the time-based index for it :) You would just buffer up all messages for a 5-minute sliding window and maintain all messages sorted by timestamp within this window. Each time the window "moves", you write the oldest records that "drop out" of the window to the…
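[Editor's note: the buffering Matthias describes can be sketched with a TreeMap keyed by timestamp, so the window contents stay sorted and eviction is a range operation. Names and the in-memory store are illustrative assumptions, not from the thread:]

```java
import java.util.ArrayList;
import java.util.List;
import java.util.SortedMap;
import java.util.TreeMap;

// Minimal sketch: keep messages sorted by timestamp inside a sliding
// window, and emit the ones that fall out each time the window advances.
public class TimestampSorter {
    private final long windowMs;
    // timestamp -> messages carrying that timestamp, kept in sorted order
    private final TreeMap<Long, List<String>> buffer = new TreeMap<>();
    private long maxTimestamp = Long.MIN_VALUE;

    public TimestampSorter(long windowMs) {
        this.windowMs = windowMs;
    }

    // Insert one message; return the messages that dropped out of the
    // window (i.e. are older than maxTimestamp - windowMs), in order.
    public List<String> add(long timestampMs, String message) {
        buffer.computeIfAbsent(timestampMs, t -> new ArrayList<>()).add(message);
        maxTimestamp = Math.max(maxTimestamp, timestampMs);
        List<String> evicted = new ArrayList<>();
        // headMap view: everything strictly before the window start
        SortedMap<Long, List<String>> old = buffer.headMap(maxTimestamp - windowMs);
        for (List<String> msgs : old.values()) {
            evicted.addAll(msgs);
        }
        old.clear(); // clearing the view removes the entries from the buffer
        return evicted;
    }
}
```

Out-of-order arrivals within the window are handled for free: the TreeMap re-sorts them, and only records later than the window span behind the maximum seen timestamp get flushed.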

Re: Streams and Windows

2017-11-21 Thread Matthias J. Sax
You need to disable the KTable cache to get every update, by setting the cache size to zero: https://docs.confluent.io/current/streams/developer-guide/memory-mgmt.html -Matthias
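[Editor's note: the setting Matthias refers to is a single Streams config key; a minimal fragment (the commit-interval line is an optional, commonly paired tweak, not something from the thread):]

```properties
# Disable the Streams record cache so every KTable update is forwarded
# downstream instead of being deduplicated in the cache.
cache.max.bytes.buffering=0
# Optionally commit (and thus flush) more often than the 30 s default.
commit.interval.ms=1000
```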

Re: Time-Based Index for Consuming Messages Up to Certain Timestamp

2017-11-21 Thread Ted Yu
bq. an older timestamp that allowed I guess you meant 'than allowed'. Cheers

Re: Time-Based Index for Consuming Messages Up to Certain Timestamp

2017-11-21 Thread Ray Ruvinskiy
Thanks for your reply! I am quite inexperienced when it comes to Kafka and Kafka Streams, so I would appreciate a little more guidance. How would one keep messages within a sliding window sorted by timestamp? Would the sort operation be done entirely in memory? I would potentially be dealing with hu…

Re: Time-Based Index for Consuming Messages Up to Certain Timestamp

2017-11-21 Thread Matthias J. Sax
Using Kafka Streams, it seems reasonable to implement this with the low-level Processor API and a custom state store. Thus, you use the `StateStore` interface to implement your state store; this allows you to spill to disk if you need to handle state larger than main memory. If you want to brow…