Hi Sa,
I created a bulk consumer which consumes, processes, and posts to ElasticSearch.
There is a config for the size of message consumption, and you can modify
the code to decide what to do with the consumed messages.
https://github.com/reachkrishnaraj/kafka-elasticsearch-standalone-consumer
in Web Server which is a
significant effort.
I totally understand that having a layer like a Logging service in between
Kafka and the Application would defeat the purpose of Kafka.
And I would love to know your advice on how best to handle this type of
maintenance.
Thanks,
Krishna Raj
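
For anyone curious, here is a rough sketch of the shape of that batching consumer, assuming the 0.8 high-level consumer API; the topic name, batch size, and the postBulkToElasticsearch helper are placeholders for illustration, not the repo's actual code:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;

public class BulkElasticsearchConsumer {

    public static void main(String[] args) {
        // Minimal 0.8 high-level consumer configuration (values are placeholders).
        Properties props = new Properties();
        props.put("zookeeper.connect", "localhost:2181");
        props.put("group.id", "es-indexer");
        props.put("auto.offset.reset", "smallest");
        ConsumerConnector connector =
                Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

        // One stream for the topic to index (topic name is a placeholder).
        Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
        topicCountMap.put("events", 1);
        Map<String, List<KafkaStream<byte[], byte[]>>> streams =
                connector.createMessageStreams(topicCountMap);
        ConsumerIterator<byte[], byte[]> it = streams.get("events").get(0).iterator();

        // Accumulate messages and flush in batches -- the "size of message
        // consumption" mentioned above would be a config value like this.
        int batchSize = 500;
        List<String> batch = new ArrayList<String>();
        while (it.hasNext()) {
            batch.add(new String(it.next().message()));
            if (batch.size() >= batchSize) {
                postBulkToElasticsearch(batch);
                batch.clear();
            }
        }
    }

    // Hypothetical helper: build an NDJSON body from the batch and POST it to
    // Elasticsearch's /_bulk endpoint (HTTP client code omitted here).
    private static void postBulkToElasticsearch(List<String> docs) {
        // ... processing / indexing logic goes here ...
    }
}

The batch-and-flush loop is the part controlled by the size config mentioned above.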
On
without affecting my large-scale Production system. It's not an
easy thing to get a window for these types of changes on a large-scale
production application :)
Any advice on how this can be achieved (even moderately) would greatly help.
Thanks,
Krishna Raj
> On Mon, Mar 24, 2014 at 10:43 PM, Krishna Raj wrote:
>
> > Hi experts & Kafka Dev team,
> >
> > Have a very quick question and need your help in designing a consumer. I am
> > trying to keep the email short and simple.
> >
> > Scenario: Let's
Thanks for your time !
Krishna Raj
> ...consumer
> does not yet exist so you definitely can't use that. :-)
>
> Hope that helps!
>
> Cheers,
>
> -Jay
>
> On Wed, Mar 19, 2014 at 2:33 AM, Krishna Raj wrote:
>
>> Hello Experts & Kafka Team,
>>
>> It's exciting to learn
Hello Experts & Kafka Team,
It's exciting to learn and work with Kafka. I have been going through a
lot of pages and Q&A.
We are building an infra & topology using Kafka for event processing in
our application.
We need some advice on designing the Producer and Consumer.
*Please find attach
log.
>
> kafka.common.OffsetOutOfRangeException: Request for offset 1318 but we only
> have log segments in the range 0 to 2.
>
> If you run console consumer on that topic (using --from-beginning), how
> many messages do you see?
>
> Thanks,
>
> Jun
>
>
> On Mon, Fe
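
As an aside, the check Jun suggests above can be run roughly like this with the 0.8-era scripts (the topic name and ZooKeeper address are placeholders):

    bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic my-topic --from-beginning

If it prints only a couple of messages, the log really does stop at offset 2, and the OffsetOutOfRangeException simply means the consumer is asking for an offset (1318) that falls outside the broker's current log segment range.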
Hi there,
I am a newbie to Kafka. I am trying to use the plugin
(https://github.com/endgameinc/elasticsearch-river-kafka) to pull
messages from Kafka.
When I start ElasticSearch, the 1st message gets pulled into the
cluster, and after that no messages are pulled even though there are enough messages
Hi All,
I have banged my head for nearly 3 hours trying to find out why my producer
code did not enqueue the message, and later found a very simple issue.
I don't want anyone else to waste their time on this, hence this email.
1) The broker is on a physical Linux box. The broker was registered usi