Hi Jakob,
The consumer is most likely not receiving messages because they have already
been consumed under the consumer group jupiter-workers. The options below
should help you read the messages again.
1. Change the group.id from jupiter-workers to something else, e.g.
jupiter-workers-latest (see the sketch below the list)
2.
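For option 1, a minimal consumer sketch with a fresh group id; the bootstrap
server, topic name, and the auto.offset.reset choice are assumptions, so
adjust them to your setup.

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class FreshGroupConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumption
        // A new group id, so the committed offsets of jupiter-workers are not reused.
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "jupiter-workers-latest");
        // Start from the beginning of the topic since the new group has no committed offsets.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("your-topic")); // assumption: topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}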
Hi Ola,
I would suggest going with a single topic with multiple partitions. Once the
data is received from the topic, you can perform a database update to store
it and then use the data for analysis.
Also, the URL below can be used for topic sizing:
eventsizer.io
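If it helps, here is a minimal AdminClient sketch for creating one topic with
several partitions; the topic name, partition count, and replication factor
are only assumptions.

import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreatePartitionedTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumption

        try (AdminClient admin = AdminClient.create(props)) {
            // One topic with several partitions, so multiple consumers in the same
            // group can share the load before the data is written to the database.
            NewTopic topic = new NewTopic("events", 6, (short) 3); // name/partitions/replication are assumptions
            admin.createTopics(List.of(topic)).all().get();
        }
    }
}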
Thanks
C
to the same partition, hence the suggestion to re-key mentioned above.
>
> It's not clear to me if you require only the "finalized" message be emitted
> downstream. In either case you could filter emitted messages so only the
> one with all the keys is passed to downstream o
Hi,
Greetings!
My requirement is as below.
I have a topic named "sample-topic". The topic contains multiple messages
with the same key (as String), and the messages are in JSON format. I would
like to merge the JSON messages and produce a final JSON document. In order
to achieve this, how to maintain the
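One common way to merge all messages that share a key is a Kafka Streams
aggregation. A minimal sketch, assuming String keys, JSON values carried as
Strings, and a hypothetical mergeJson() helper; the output topic and
application id are placeholders.

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;

public class JsonMergeTopology {

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Fold every message that shares a key into one running JSON document.
        KTable<String, String> merged = builder
                .stream("sample-topic", Consumed.with(Serdes.String(), Serdes.String()))
                .groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
                .aggregate(
                        () -> "{}",
                        (key, value, aggregate) -> mergeJson(aggregate, value),
                        Materialized.with(Serdes.String(), Serdes.String()));

        // Emit the merged document whenever it changes. The output topic is an assumption.
        merged.toStream().to("sample-topic-merged", Produced.with(Serdes.String(), Serdes.String()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "json-merge-app");     // assumption
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumption
        new KafkaStreams(builder.build(), props).start();
    }

    // Hypothetical merge rule: replace with a real JSON merge (e.g. a Jackson deep merge).
    private static String mergeJson(String aggregate, String value) {
        return "{}".equals(aggregate) ? value : aggregate + value;
    }
}

The aggregation keeps one running document per key, so every new message for
that key updates the merged result.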
ta load from
> what?
>
> Thanks,
>
> Liam Clarke-Hutchinson
>
> On Sun, 31 May 2020, 4:50 pm Suresh Chidambaram,
> wrote:
>
> > Hi Team,
> >
> > Could someone help me with my request below?
> >
> > Thanks
> > C Suresh
> >
Hi Team,
Could someone help me with my request below?
Thanks
C Suresh
On Saturday, May 30, 2020, Suresh Chidambaram
wrote:
> Hi Team,
>
> > My requirement is that I have to perform a One-Time Data Load (initial
> > load) to the topic, and later I have to perform the delta load to the topic
Hi Team,
My requirement is that I have to perform a One-Time Data Load (initial load)
to the topic, and later I have to perform the delta load to the topic. Could
someone guide me on how to achieve this requirement in Confluent Kafka/Apache
Kafka?
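A minimal producer sketch of the idea, assuming the data comes from a
database; the topic name is an assumption and the fetch helpers are
hypothetical stand-ins for your own queries. Keying by primary key lets the
delta records supersede the snapshot records downstream.

import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class InitialAndDeltaLoad {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumption
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // One-time initial load: publish the full snapshot, keyed by primary key.
            for (String[] row : fetchFullSnapshot()) {
                producer.send(new ProducerRecord<>("data-topic", row[0], row[1])); // topic name is an assumption
            }
            producer.flush();

            // Delta load: publish only rows changed since the last run (e.g. by timestamp or CDC).
            for (String[] row : fetchChangedRowsSinceLastRun()) {
                producer.send(new ProducerRecord<>("data-topic", row[0], row[1]));
            }
            producer.flush();
        }
    }

    // Hypothetical data-access helpers standing in for your database queries.
    private static List<String[]> fetchFullSnapshot() { return List.of(); }
    private static List<String[]> fetchChangedRowsSinceLastRun() { return List.of(); }
}

A Kafka Connect JDBC source connector is another option: mode=bulk for the
one-time load and mode=timestamp+incrementing for the deltas.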
Thanks
C Suresh
Hi All,
Currently, I'm working on a use case wherein I have to deserialize an Avro
object and convert it to another Avro format. Below is the flow.
DB -> Source Topic(Avro format) -> Stream Processor -> Target Topic (Avro
as nested object).
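A minimal Kafka Streams sketch of this kind of reshaping, using GenericRecord
and the Confluent GenericAvroSerde; the topic names, schema registry URL, and
the nested target schema are assumptions.

import java.util.Map;
import java.util.Properties;

import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class AvroReshapeTopology {

    // Hypothetical nested target schema: the flat source fields are wrapped in a "payload" record.
    private static final Schema TARGET_SCHEMA = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Target\",\"fields\":["
          + "{\"name\":\"payload\",\"type\":{\"type\":\"record\",\"name\":\"Payload\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"}]}}]}");

    public static void main(String[] args) {
        Map<String, String> serdeConfig =
                Map.of("schema.registry.url", "http://localhost:8081"); // assumption
        GenericAvroSerde avroSerde = new GenericAvroSerde();
        avroSerde.configure(serdeConfig, false); // false = value serde

        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("source-topic", Consumed.with(Serdes.String(), avroSerde))
               .mapValues(AvroReshapeTopology::toNested)
               .to("target-topic", Produced.with(Serdes.String(), avroSerde));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "avro-reshape-app");   // assumption
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumption
        new KafkaStreams(builder.build(), props).start();
    }

    // Copy the flat source record into the nested target layout; "id" is an assumed source field.
    private static GenericRecord toNested(GenericRecord source) {
        GenericRecord payload = new GenericData.Record(TARGET_SCHEMA.getField("payload").schema());
        payload.put("id", source.get("id"));
        GenericRecord target = new GenericData.Record(TARGET_SCHEMA);
        target.put("payload", payload);
        return target;
    }
}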
When I deserialize the message from the Source Topic,
>
> On Tue, Apr 28, 2020 at 11:12 AM Suresh Chidambaram <
> chida.sur...@gmail.com>
> wrote:
>
> > Sure Khaja.
> >
> > Thanks
> > C Suresh
> >
> > On Tuesday, April 28, 2020, KhajaAsmath Mohammed <
> mdkhajaasm...@gmail.com>
>> Hello Suresh,
>>
>> I am also looking for the same. Let me know if you find anything
>>
>> Sent from my iPhone
>>
>> On Apr 28, 2020, at 8:25 PM, Suresh Chidambaram
>>> wrote:
>>>
>>> Hi Team,
>>>
>>> Greet
Sure Khaja.
Thanks
C Suresh
On Tuesday, April 28, 2020, KhajaAsmath Mohammed
wrote:
> Hello Suresh,
>
> I am also looking for the same. Let me know if you find anything
>
> Sent from my iPhone
>
> > On Apr 28, 2020, at 8:25 PM, Suresh Chidambaram
> wrote:
> >
Hi Team,
Greetings.
I have been looking for an example application that uses Kafka Streams
with Spring Boot, but I'm unable to find one on the internet. Could
someone help me by providing the code?
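For reference, a minimal sketch of how Kafka Streams is usually wired into a
Spring Boot application via spring-kafka's @EnableKafkaStreams; the topic
names, application id, and the toy topology are placeholders.

import java.util.Map;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.annotation.EnableKafkaStreams;
import org.springframework.kafka.annotation.KafkaStreamsDefaultConfiguration;
import org.springframework.kafka.config.KafkaStreamsConfiguration;

@SpringBootApplication
@EnableKafkaStreams
public class StreamsApplication {

    public static void main(String[] args) {
        SpringApplication.run(StreamsApplication.class, args);
    }

    // spring-kafka looks up this bean name to build and manage the KafkaStreams instance.
    @Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_CONFIG_BEAN_NAME)
    public KafkaStreamsConfiguration streamsConfig() {
        Map<String, Object> props = Map.of(
                StreamsConfig.APPLICATION_ID_CONFIG, "spring-streams-demo",   // assumption
                StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",     // assumption
                StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName(),
                StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
        return new KafkaStreamsConfiguration(props);
    }

    // A trivial topology just to show the wiring: uppercase values from one topic into another.
    @Bean
    public KStream<String, String> topology(StreamsBuilder builder) {
        KStream<String, String> stream = builder.stream("input-topic");       // assumption
        stream.mapValues(value -> value.toUpperCase()).to("output-topic");    // assumption
        return stream;
    }
}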
Thanks
C Suresh
Hi Val,
Could you share the server.properties and zookeeper.properties?
Thanks
C Suresh
On Tuesday, April 28, 2020, Valentin Kulichenko <
valentin.kuliche...@gmail.com> wrote:
> Greetings to the Kafka Community!
>
> I'm a newbie in Kafka and only recently went beyond a local installation
> des
>
> Hope that helps,
>
> Liam Clarke-Hutchinson
>
> On Thu, 23 Apr. 2020, 4:41 am Suresh Chidambaram,
> wrote:
>
> > Hi Team,
> >
> > Greetings.
> >
> > I have a use-case wherein I have to consume messages from multiple topics
>
Hi Team,
Greetings.
I have a use-case wherein I have to consume messages from multiple topics
using Kafka, process them using Kafka Streams, and then publish the messages
to multiple target topics.
The example is below.
Source topic A - process A - target topic A
Source topic B - process B - targe
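For reference, a minimal Kafka Streams sketch of this kind of fan-out
topology; the topic names, processing steps, and config values are
placeholders.

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public class MultiTopicTopology {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Independent sub-topologies: each source topic gets its own processing and target topic.
        builder.stream("source-topic-a", Consumed.with(Serdes.String(), Serdes.String()))
               .mapValues(MultiTopicTopology::processA)
               .to("target-topic-a", Produced.with(Serdes.String(), Serdes.String()));

        builder.stream("source-topic-b", Consumed.with(Serdes.String(), Serdes.String()))
               .mapValues(MultiTopicTopology::processB)
               .to("target-topic-b", Produced.with(Serdes.String(), Serdes.String()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "multi-topic-app");    // assumption
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumption
        new KafkaStreams(builder.build(), props).start();
    }

    // Hypothetical per-topic processing steps.
    private static String processA(String value) { return value.trim(); }
    private static String processB(String value) { return value.toLowerCase(); }
}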