Ah ok.  Another dumb question: what about acks?  Are you using auto-ack? 
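(For context: the 0.9.x new consumer has no explicit ack setting; "acking" a message means committing its offset. A minimal sketch of the two relevant properties, shown at what I believe are their defaults:)

```properties
# Offset handling in the 0.9.x consumer (defaults shown)
enable.auto.commit=true        # commit offsets automatically in the background
auto.commit.interval.ms=5000   # how often offsets are committed when auto-commit is on
```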

On 7/19/16, 10:00 AM, "Abhinav Solan" <abhinav.so...@gmail.com> wrote:

    If I add 2 more nodes and make it a cluster, would that help? I have
    searched the forums and nothing on this kind of behavior turns up. If we
    had a cluster, then maybe the Kafka server would have a backup option and
    could self-heal from this behavior ... just a theory.
    
    On Tue, Jul 19, 2016, 7:57 AM Abhinav Solan <abhinav.so...@gmail.com> wrote:
    
    > No, I was monitoring the app at that time; it was just sitting idle
    >
    > On Tue, Jul 19, 2016, 7:32 AM David Garcia <dav...@spiceworks.com> wrote:
    >
    >> Is it possible that your app is thrashing (i.e. FullGC’ing too much and
    >> not processing messages)?
    >>
    >> -David
    >>
    >> On 7/19/16, 9:16 AM, "Abhinav Solan" <abhinav.so...@gmail.com> wrote:
    >>
    >>     Hi Everyone, can anyone help me with this?
    >>
    >>     Thanks,
    >>     Abhinav
    >>
    >>     On Mon, Jul 18, 2016, 6:19 PM Abhinav Solan <abhinav.so...@gmail.com>
    >> wrote:
    >>
    >>     > Hi Everyone,
    >>     >
    >>     > Here are my settings
    >>     > Using Kafka 0.9.0.1, 1 instance (as we are testing things on a
    >> staging
    >>     > environment)
    >>     > Subscribing to 4 topics from a single Consumer application with 4
    >> threads
    >>     >
    >>     > The server works fine for a while, then after about 3-4 hrs
    >>     > or so it stops consuming at all.
    >>     > I started my own consumer instance and one Kafka console consumer;
    >>     > I can see messages coming into the console consumer but not into
    >>     > my consumer instance.
    >>     >
    >>     > Some messages come through, but not all. After a while I
    >>     > restarted the consumer instance, and again nothing came
    >>     > through. Then I restarted the Kafka server, and after that I
    >>     > could see all the messages coming through.
    >>     >
    >>     > Has anyone seen this kind of problem?
    >>     > Is it because I am running only a single broker?
    >>     >
    >>     > Here are the properties I am setting for the Consumer -
    >>     > fetch.min.bytes=1
    >>     > max.partition.fetch.bytes=8192
    >>     > heartbeat.interval.ms=10000
    >>     >
    >>     > I have written my consumer following
    >>     > http://docs.confluent.io/2.0.1/clients/consumer.html
    >>     >
    >>     > Thanks,
    >>     > Abhinav
    >>     >
    >>     >
    >>
    >>
    >>
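For reference, the consumer settings quoted in the thread map onto a config like the following sketch (the broker address and group id are made up; session.timeout.ms is shown at its 0.9.x default):

```properties
# Sketch of the consumer config described above (hypothetical host and group id)
bootstrap.servers=localhost:9092
group.id=staging-consumer
fetch.min.bytes=1
max.partition.fetch.bytes=8192
# heartbeat.interval.ms must stay well below session.timeout.ms, otherwise the
# group coordinator can mark the consumer dead and revoke its partitions
heartbeat.interval.ms=10000
session.timeout.ms=30000
```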
    
