Why Does Kafka High Level Consumer 0.8.2.2 Stop Responding After a Few Hours... Timeout in Broker Log

2017-08-01 Thread Rachana Srivastava
:54) at kafka.network.Processor.read(SocketServer.scala:445) at kafka.network.Processor.run(SocketServer.scala:341) at java.lang.Thread.run(Thread.java:745) I have written a simple Kafka High Level consumer. I have not specified any value for the

How to tune Kafka High Level Consumer for Production

2017-07-24 Thread Rachana Srivastava
I want to know how to tune/set up a high level Kafka client against a Kafka server in EC2. I set zookeeper.session.timeout.ms=5. I found that after some time I got the following error in the logs. I want to know how to tune the Kafka parameters to run the consumer forever. I checked and found ZK is running

Kafka High Level Consumer OOME

2016-08-01 Thread 张学文
Hi, our Kafka consumer application had been running for a week without any problems, but today I ran into an OOME while trying to consume from one topic with 100 partitions using 100 consumers. The configurations for the consumers are: zookeeper.session.timeout.ms = 1 zookeeper.sync.time.ms = 200 au

Re: Kafka High Level Consumer Message Loss?

2015-07-12 Thread Mayuresh Gharat
Can you confirm that you are not actually seeing the messages on the lagging broker? Because if the Max Lag is 0 it should mean that the consumer has read offsets up to the log end offset of the broker. Thanks, Mayuresh On Fri, Jul 10, 2015 at 8:29 PM, Allen Wang wrote: > We have two applications

Kafka High Level Consumer Message Loss?

2015-07-10 Thread Allen Wang
We have two applications that consume all messages from one Kafka cluster. We found that the MessagesPerSec metric started to diverge after some time. One of them matches the MessagesInPerSec metric from the Kafka broker, while the other is lower than the broker metric and appears to have some mess

Getting Kafka High Level Consumer Id

2015-05-25 Thread Rahul Amaram
Consider the high level consumer example https://cwiki.apache.org/confluence/display/KAFKA/Consumer+Group+Example. Using the Java API, is it possible to fetch the consumer id for this particular consumer as displayed in the ConsumerOffsetCheck "Owner" field? Thanks, Rahul.

Re: Kafka High Level Consumer OOME

2015-03-17 Thread Guozhang Wang
Hello Dima, The current consumer does not have an explicit memory control mechanism, but you can try to indirectly bound the memory usage via the following configs: fetch.message.max.bytes and queued.max.message.chunks. Details can be found at http://kafka.apache.org/documentation.html#consumerconfig
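
For reference, a minimal properties sketch of the two configs Guozhang mentions, for the old high level consumer; the connection settings and values below are placeholders, not recommendations from the thread:

    Properties props = new Properties();
    props.put("zookeeper.connect", "localhost:2181");    // placeholder address
    props.put("group.id", "group1");                      // placeholder group id
    props.put("fetch.message.max.bytes", "1048576");      // max bytes fetched per partition per request
    props.put("queued.max.message.chunks", "2");          // max fetched chunks buffered per stream
    ConsumerConfig config = new ConsumerConfig(props);
    // rough upper bound on fetch buffering per consumer:
    // (#partitions it owns) * fetch.message.max.bytes * queued.max.message.chunks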

Kafka High Level Consumer OOME

2015-03-17 Thread Dima Dimas
Hi, I ran into an OOME while trying to consume from one topic with 10 partitions (100,000 messages per partition) using 5 consumers (consumer groups), consumer.timeout=10ms. The OOME occurred 1-2 minutes after start. Java heap: Xms=1024M. LAN about 10Gbit. This is a standalone application. Kafka version 0.8.2

Re: Kafka High Level Consumer

2015-02-27 Thread Pranay Agarwal
Rs to make it better. > > -Joe Lawson > > > From: Pranay Agarwal > Sent: Wednesday, February 25, 2015 1:45 AM > To: users@kafka.apache.org > Subject: Re: Kafka High Level Consumer > > Thanks Jun. It seems it was an issue with jruby client I was using. Now, > they fix

Re: Kafka High Level Consumer

2015-02-25 Thread Joseph Lawson
ubject: Re: Kafka High Level Consumer Thanks Jun. It seems it was an issue with jruby client I was using. Now, they fixed it. -Pranay On Mon, Feb 23, 2015 at 4:57 PM, Jun Rao wrote: > Did you enable auto offset commit? > > Thanks, > > Jun > > On Tue, Feb 17, 2015 at 4:22 PM, Pr

Re: Kafka High Level Consumer

2015-02-24 Thread Pranay Agarwal
Thanks Jun. It seems it was an issue with jruby client I was using. Now, they fixed it. -Pranay On Mon, Feb 23, 2015 at 4:57 PM, Jun Rao wrote: > Did you enable auto offset commit? > > Thanks, > > Jun > > On Tue, Feb 17, 2015 at 4:22 PM, Pranay Agarwal > wrote: > > > Hi, > > > > I am trying to

Re: Kafka High Level Consumer

2015-02-23 Thread Jun Rao
Did you enable auto offset commit? Thanks, Jun On Tue, Feb 17, 2015 at 4:22 PM, Pranay Agarwal wrote: > Hi, > > I am trying to read kafka consumer using high level kafka Consumer API. I > had to restart the consumers for some reason but I kept the same group id. > It seems the consumers have s
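
For reference, a minimal sketch of the auto offset commit settings Jun is asking about, for the old high level consumer; the addresses and values are placeholders:

    Properties props = new Properties();
    props.put("zookeeper.connect", "localhost:2181");    // placeholder address
    props.put("group.id", "group1");                      // keep the same group id across restarts
    props.put("auto.commit.enable", "true");              // periodically commit consumed offsets to ZooKeeper
    props.put("auto.commit.interval.ms", "1000");         // placeholder interval
    ConsumerConfig config = new ConsumerConfig(props);
    // with auto commit disabled, call consumerConnector.commitOffsets() after processing instead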

Kafka High Level Consumer

2015-02-17 Thread Pranay Agarwal
Hi, I am trying to consume from Kafka using the high level Kafka Consumer API. I had to restart the consumers for some reason but I kept the same group id. It seems the consumers have started consuming from the beginning (offset 0) instead of from the point they had already consumed. What am I doing wr

Re: Kafka High Level Consumer Connector shuts down after 10 seconds

2014-11-26 Thread Davide Brambilla
Hi, you are programmatically shutting down the executor after 10 seconds: try { Thread.sleep(10000); } catch (InterruptedException ie) { } example.shutdown(); If you do not execute this code, your threads will run forever. Davide B.
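
A minimal sketch of the change Davide describes, based on the ConsumerGroupExample from the wiki; the class name, constructor arguments, and thread count below follow that example and are assumptions about your copy of it:

    public class RunForever {
        public static void main(String[] args) {
            final ConsumerGroupExample example =
                    new ConsumerGroupExample("localhost:2181", "group1", "test");
            example.run(4); // spawn 4 consumer threads

            // instead of Thread.sleep(10000) followed by example.shutdown(),
            // only shut down when the JVM exits, so the threads keep consuming
            Runtime.getRuntime().addShutdownHook(new Thread() {
                public void run() {
                    example.shutdown();
                }
            });
        }
    }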

Re: Kafka High Level Consumer

2014-09-15 Thread Rahul Mittal
Thanks Joe Stein This worked :) On Fri, Sep 12, 2014 at 3:19 PM, Rahul Mittal wrote: > Hi , > Is there a way in kafka to read data from all topics, from a consumer > group without specifying topics in a dynamic way. > That is if new topics are created on kafka brokers the consumer group > should

Re: Kafka High Level Consumer

2014-09-12 Thread Joe Stein
You want to use createMessageStreamsByFilter and pass in a Whitelist with a regex that would include everything you want... here is e.g. how to use that: https://github.com/apache/kafka/blob/0.8.1/core/src/main/scala/kafka/consumer/ConsoleConsumer.scala#L196
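
A minimal sketch of the wildcard approach Joe describes, using the old high level consumer's createMessageStreamsByFilter with a Whitelist regex; the connection settings here are placeholders:

    import java.util.List;
    import java.util.Properties;
    import kafka.consumer.Consumer;
    import kafka.consumer.ConsumerConfig;
    import kafka.consumer.KafkaStream;
    import kafka.consumer.Whitelist;
    import kafka.javaapi.consumer.ConsumerConnector;

    public class WildcardConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("zookeeper.connect", "localhost:2181"); // placeholder address
            props.put("group.id", "all-topics-group");        // placeholder group id
            ConsumerConnector connector =
                    Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

            // Whitelist takes a regex; ".*" matches every topic, including topics created later
            List<KafkaStream<byte[], byte[]>> streams =
                    connector.createMessageStreamsByFilter(new Whitelist(".*"), 1);

            for (KafkaStream<byte[], byte[]> stream : streams) {
                // drain each stream in its own thread via stream.iterator()
            }
        }
    }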

Kafka High Level Consumer

2014-09-12 Thread Rahul Mittal
Hi, is there a way in Kafka to have a consumer group read data from all topics dynamically, without specifying the topics up front? That is, if new topics are created on the Kafka brokers, the consumer group should figure it out and start reading from the new topics as well without explicitly defining new topic

Re: kafka high level consumer - threads guaranteed to read a single partition?

2014-08-19 Thread Guozhang Wang
Hi Josh, yes. Consumption distribution in Kafka is at the granularity of partitions, i.e. each partition will only be consumed by one consumer within the group. Guozhang On Tue, Aug 19, 2014 at 2:01 AM, Josh J wrote: > Hi, > > For the kafka high level consumer, if I create ex
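
A minimal sketch of the one-stream-per-partition setup discussed above; the topic name, partition count, and connection settings are placeholders:

    import java.util.Collections;
    import java.util.List;
    import java.util.Map;
    import java.util.Properties;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import kafka.consumer.Consumer;
    import kafka.consumer.ConsumerConfig;
    import kafka.consumer.KafkaStream;
    import kafka.javaapi.consumer.ConsumerConnector;

    public class OneThreadPerPartition {
        public static void main(String[] args) {
            int numPartitions = 4; // placeholder; match the topic's partition count

            Properties props = new Properties();
            props.put("zookeeper.connect", "localhost:2181"); // placeholder address
            props.put("group.id", "group1");                  // placeholder group id
            ConsumerConnector connector =
                    Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

            // ask for as many streams as partitions; the rebalance assigns each
            // partition to exactly one stream within the group
            Map<String, List<KafkaStream<byte[], byte[]>>> streams =
                    connector.createMessageStreams(Collections.singletonMap("test", numPartitions));

            ExecutorService executor = Executors.newFixedThreadPool(numPartitions);
            for (final KafkaStream<byte[], byte[]> stream : streams.get("test")) {
                executor.submit(new Runnable() {
                    public void run() {
                        // each thread drains one stream; with streams == partitions,
                        // every thread reads from a single partition
                    }
                });
            }
        }
    }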

kafka high level consumer - threads guaranteed to read a single partition?

2014-08-19 Thread Josh J
Hi, For the kafka high level consumer, if I create exactly the number of threads as the number of partitions, is there a guarantee that each thread will be the only thread that reads from a particular partition? I'm following this example <https://github.com/bingoohuang/java-sand

Re: Kafka High Level Consumer Fail Over

2014-06-13 Thread Bhavesh Mistry
We have a 3 node cluster with a separate physical box for the consumer group, and the consumer that died is "mupd_logmon_hb_events_sdc-q1-logstream-8-1402448850475-6521f70a". On that box, I see the above exception. What can I configure so that, when a partition in the consumer group does not have an "Owner", other

Re: Kafka High Level Consumer Fail Over

2014-06-13 Thread Guozhang Wang
From which consumer instance did you see these exceptions? Guozhang On Thu, Jun 12, 2014 at 4:39 PM, Bhavesh Mistry wrote: > Hi Kafka Dev Team/ Users, > > We have high level consumer group consuming from 32 partitions for a > topic. We have been running 48 consumers in this group across mu

Kafka High Level Consumer Fail Over

2014-06-12 Thread Bhavesh Mistry
Hi Kafka Dev Team / Users, we have a high level consumer group consuming from 32 partitions of a topic. We have been running 48 consumers in this group across multiple servers. We have kept 16 as back-up consumers, hoping that when a consumer dies, meaning when Zookeeper does not have an owner

Re: Kafka High Level Consumer Connector shuts down after 10 seconds

2014-03-24 Thread Jun Rao
> streams = consumerMap.get(topic); > // now launch all the threads > executor = Executors.newFixedThreadPool(a_numThreads); > // now create an object to consume the messages

Re: Kafka High Level Consumer Connector shuts down after 10 seconds

2014-03-24 Thread Guozhang Wang
> // now launch all the threads > executor = Executors.newFixedThreadPool(a_numThreads); > // now create an object to consume the messages

Re: Kafka High Level Consumer Connector shuts down after 10 seconds

2014-03-10 Thread Neha Narkhede
Session termination can happen either when client or zookeeper process pauses (due to GC) or when the client process terminates. A sustainable solution is to tune GC settings. For now, you can try increasing the zookeeper.session.timeout.ms. On Sun, Mar 9, 2014 at 3:44 PM, Ameya Bhagat wrote:
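
For reference, a minimal sketch of the session timeout tweak Neha mentions, for the old high level consumer; the 30000 ms value is a placeholder, not a recommendation from the thread:

    Properties props = new Properties();
    props.put("zookeeper.connect", "localhost:2181");      // placeholder address
    props.put("group.id", "group1");                        // placeholder group id
    props.put("zookeeper.session.timeout.ms", "30000");     // placeholder; raise if GC pauses expire the session
    ConsumerConfig config = new ConsumerConfig(props);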

Kafka High Level Consumer Connector shuts down after 10 seconds

2014-03-09 Thread Ameya Bhagat
I am using a high level consumer as described at: https://cwiki.apache.org/confluence/display/KAFKA/Consumer+Group+Example I am noticing that my consumer does not run forever and ends after some time (< 15s). At the zookeeper side, I see the following: INFO Processed session termination for sessi

Re: Kafka High Level Consumer Fetch All Messages From Topic Using Java API (Equivalent to --from-beginning)

2014-02-14 Thread pushkar priyadarshi
ck to either most current or oldest message offset. But others' more experienced opinions on this would be great. Regards, Pushkar On Feb 14, 2014 4:40 PM, wrote: > Good Morning, > > I am testing the Kafka High Level Consumer using the ConsumerGroupExample > code from the Kafka

Kafka High Level Consumer Fetch All Messages From Topic Using Java API (Equivalent to --from-beginning)

2014-02-14 Thread jpania
Good Morning, I am testing the Kafka High Level Consumer using the ConsumerGroupExample code from the Kafka site. I would like to retrieve all the existing messages on the topic called "test" that I have in the Kafka server config. Looking at other blogs, auto.offset.reset should
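
A minimal sketch of the usual approach for this with the old high level consumer (the values below are placeholders): set auto.offset.reset to "smallest" and use a group.id that has no committed offsets yet, so the first fetch starts from the beginning of the log, roughly the equivalent of --from-beginning on the console consumer:

    Properties props = new Properties();
    props.put("zookeeper.connect", "localhost:2181");                       // placeholder address
    props.put("group.id", "from-beginning-" + System.currentTimeMillis());  // fresh group => no stored offsets
    props.put("auto.offset.reset", "smallest");                             // only applies when no committed offset exists
    ConsumerConfig config = new ConsumerConfig(props);
    ConsumerConnector consumer = Consumer.createJavaConsumerConnector(config);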