Rs to make it better.
>
> -Joe Lawson
>
> ________
> From: Pranay Agarwal
> Sent: Wednesday, February 25, 2015 1:45 AM
> To: users@kafka.apache.org
> Subject: Re: Kafka High Level Consumer
>
Thanks Jun. It seems it was an issue with the jruby client I was using. Now
they have fixed it.
-Pranay
On Mon, Feb 23, 2015 at 4:57 PM, Jun Rao wrote:
> Did you enable auto offset commit?
>
> Thanks,
>
> Jun
>
> On Tue, Feb 17, 2015 at 4:22 PM, Pranay Agarwal
> wrote:
>
Hi,
I am trying to consume from Kafka using the high-level consumer API. I had
to restart the consumers for some reason, but I kept the same group id. It
seems the consumers have started consuming from the beginning (offset 0)
instead of from the point they had already consumed up to.
What am I doing wrong here?
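(For reference, a minimal sketch of the offset-commit settings involved,
assuming the 0.8.x Java high-level consumer that the JRuby client in this
thread presumably wraps; the ZooKeeper address, group id "my-group", topic
"mytopic", and class name below are placeholders, not from this thread.)

    import java.util.Collections;
    import java.util.List;
    import java.util.Map;
    import java.util.Properties;

    import kafka.consumer.Consumer;
    import kafka.consumer.ConsumerConfig;
    import kafka.consumer.ConsumerIterator;
    import kafka.consumer.KafkaStream;
    import kafka.javaapi.consumer.ConsumerConnector;
    import kafka.message.MessageAndMetadata;

    public class OffsetResumeSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("zookeeper.connect", "localhost:2181"); // placeholder ZooKeeper address
            props.put("group.id", "my-group");                // reuse the same group id to resume
            props.put("auto.commit.enable", "true");          // commit consumed offsets to ZooKeeper
            props.put("auto.commit.interval.ms", "1000");
            // Only consulted when the group has no committed offset yet:
            // "largest" starts at the log end, "smallest" starts from offset 0.
            props.put("auto.offset.reset", "largest");

            ConsumerConnector connector =
                Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

            // One stream for the placeholder topic; blocks waiting for messages.
            Map<String, List<KafkaStream<byte[], byte[]>>> streams =
                connector.createMessageStreams(Collections.singletonMap("mytopic", 1));
            ConsumerIterator<byte[], byte[]> it = streams.get("mytopic").get(0).iterator();
            while (it.hasNext()) {
                MessageAndMetadata<byte[], byte[]> msg = it.next();
                System.out.printf("partition=%d offset=%d%n", msg.partition(), msg.offset());
            }

            connector.shutdown();
        }
    }

With auto commit enabled, the consumed offsets are stored in ZooKeeper under
the group id, so a restart with the same group id resumes from the last
committed position rather than from offset 0.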
s, you are limited
> to 8 per server (probably fewer, because there is other stuff on the
> server).
>
> Gwen
>
> On Mon, Jan 19, 2015 at 3:06 PM, Pranay Agarwal
> wrote:
> > Thanks a lot Natty.
> >
> > I am using this Ruby gem on the client side with all the
> Jonathan "Natty" Natkins
> StreamSets | Customer Engagement Engineer
> mobile: 609.577.1600 | linkedin <http://www.linkedin.com/in/nattyice>
>
>
> On Mon, Jan 19, 2015 at 2:34 PM, Pranay Agarwal
> wrote:
>
> > Thanks Natty.
> >
> > Is there
on your brokers or to decrease your max fetch size.
>
> Thanks,
> Natty
>
> Jonathan "Natty" Natkins
> StreamSets | Customer Engagement Engineer
> mobile: 609.577.1600 | linkedin <http://www.linkedin.com/in/nattyice>
>
>
> On Mon, Jan 19, 20
Hi All,
I have a Kafka cluster set up with 2 topics:
topic1 with 10 partitions
topic2 with 1000 partitions.
While I am able to consume messages from topic1 just fine, I get the
following error from topic2. There is a resolved issue here on the same
thing: https://issues.apache.org/jira/browse
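(For context on Natty's suggestion above about adjusting the maximum message
size on the brokers or decreasing the consumer's fetch size, these are the
0.8.x settings that appear to be involved; the values shown are illustrative
defaults, not taken from this thread.)

    # broker, server.properties: largest message the broker will accept
    message.max.bytes=1000000
    # should be >= message.max.bytes so followers can replicate large messages
    replica.fetch.max.bytes=1048576

    # high-level consumer properties: per-partition fetch buffer,
    # must be >= message.max.bytes; with ~1000 partitions the consumer can
    # buffer roughly this much per partition, so lowering it reduces the
    # consumer's memory footprint
    fetch.message.max.bytes=1048576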
Please subscribe me.