the Kafka Streams API/library is
> a normal, standard Java application -- you can of course also use any other
> Java/Scala/... library for the application's processing needs.
>
> -Michael
>
>
>
> On Tue, Mar 14, 2017 at 9:00 AM, BYEONG-GI KIM wrote:
>
Dear Michael Noll,
I have a question: is it possible to convert JSON format to YAML format
using Kafka Streams?
Best Regards
KIM
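For reference, Kafka Streams itself has no YAML support, but since a Streams app is ordinary Java, the conversion can happen inside a mapValues() step with any JSON/YAML library. A minimal sketch, assuming Jackson's jackson-databind and jackson-dataformat-yaml are on the classpath; the application id, broker address, and topic names ("input-json", "output-yaml") are placeholders, not anything from the thread:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLMapper;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;
import java.util.Properties;

public class JsonToYaml {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "json-to-yaml");      // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder

        ObjectMapper jsonMapper = new ObjectMapper();
        YAMLMapper yamlMapper = new YAMLMapper();

        KStreamBuilder builder = new KStreamBuilder();
        KStream<String, String> json =
            builder.stream(Serdes.String(), Serdes.String(), "input-json"); // placeholder topic
        json.mapValues(value -> {
            try {
                // Parse the JSON record and re-serialize the tree as YAML.
                return yamlMapper.writeValueAsString(jsonMapper.readTree(value));
            } catch (Exception e) {
                throw new RuntimeException("bad JSON record", e);
            }
        }).to(Serdes.String(), Serdes.String(), "output-yaml");             // placeholder topic

        new KafkaStreams(builder, props).start();
    }
}
```

This is a sketch against the 0.10.x-era KStreamBuilder API; it requires a running broker and is not a definitive implementation.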
2017-03-10 11:36 GMT+09:00 BYEONG-GI KIM :
> Thank you very much for the information!
>
>
> 2017-03-09 19:40 GMT+09:00 Michael Noll :
>
aka Kafka's trunk):
> https://github.com/confluentinc/examples/tree/3.2.x/kafka-streams#version-
> compatibility
>
> Hope this helps!
> Michael
>
>
>
>
> On Thu, Mar 9, 2017 at 9:59 AM, BYEONG-GI KIM wrote:
>
Hello.
I'm a newcomer who has started learning one of the new Kafka features, aka
Kafka Streams.
As far as I know, the simplest usage of Kafka Streams is to do something
like parsing, which forwards incoming data from one topic to another topic
with a few changes.
So... Here is what I'd want to
Hello.
I'm not sure whether it's a bug or not, but here is something wrong:
I set up 2 consumer apps that have the same consumer group id, and the
partition count has been set to 1 on my Kafka broker.
In theory, the messages on the Kafka broker must be consumed by only one of
them, which means a message must not be consumed by both apps.
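For reference, that expectation depends on the group.id the two apps use. A config-only sketch of the relevant consumer setting (broker address and group name are illustrative placeholders):

```properties
# Same value in BOTH consumer apps => they form one consumer group.
# With a single partition, only one member is assigned the partition
# and receives messages; the other sits idle until a rebalance.
bootstrap.servers=localhost:9092
group.id=my-shared-group
```

If the two apps accidentally ended up with distinct group ids, both would receive every message, which is the usual cause when duplicate consumption is observed.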
>
>
>
>
>
>
> On 11 July 2016 at 17:38, BYEONG-GI KIM wrote:
>
Hello.
Generally, a Kafka Consumer consumes stored messages from the Kafka broker(s)
when the Consumer starts.
I, however, want to create a function that consumes only messages arriving
after the Consumer starts, instead of consuming the previously stored
messages as I mentioned above, for real-time processing.
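For reference, two common approaches: give the consumer a group id that has no committed offsets and set auto.offset.reset=latest (the old high-level consumer called this value "largest"), or call seekToEnd() on the assigned partitions after subscribing. A config-only sketch of the first option (the group name is a placeholder):

```properties
# A group id with no committed offsets, combined with "latest",
# makes the consumer start from the end of the log: only messages
# produced after startup are delivered.
group.id=realtime-only-group
auto.offset.reset=latest
```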
Hello.
I wonder what the difference is between the kafka_2.11 and kafka-clients
artifacts on the Maven repository.
Thank you in advance!
Best regards
KIM
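For reference: kafka_2.11 is the full Kafka distribution built against Scala 2.11 (broker plus the old Scala clients), while kafka-clients is the pure-Java client library (producer and consumer) with no Scala dependency; an application that only produces or consumes normally needs just the latter. The Maven coordinates look roughly like this (version chosen purely for illustration):

```xml
<!-- full broker + old Scala clients, Scala 2.11 build -->
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_2.11</artifactId>
  <version>0.9.0.1</version>
</dependency>

<!-- pure-Java producer/consumer library: usually all an app needs -->
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <version>0.9.0.1</version>
</dependency>
```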
s access some threadsafe object if you need to combine the
> result.
> > In your linked example the consumers just each do their part, which solves
> > the multi-threaded issue, but when you want to combine data from
> different
> > consumer threads it becomes more tricky
Hello.
I've implemented a Kafka Consumer application which consumes a large amount
of monitoring data from the Kafka broker and analyzes those data accordingly.
I referred to a guide,
http://www.confluent.io/blog/tutorial-getting-started-with-the-new-apache-kafka-0.9-consumer-client,
since I thought the
examples/pageview
> <
> https://github.com/apache/kafka/tree/trunk/streams/examples/src/main/java/org/apache/kafka/streams/examples/pageview
> >
>
> It included JSON -> POJO -> JSON steps and could probably be adapted for
> your case?
>
> Regards
> To
Hello.
First, I thank the devs so much, since they've been making a great
tool as open-source software.
I'm considering applying a new feature of Kafka, aka Kafka Streams, to
my simple handler application, which receives monitoring data from Collectd
and reproduces transformed messages t
che-kafka-0.9-consumer-client
>
> It shows the basics of using the consumer, as well as a section where they
> launch 3 threads, each with one consumer, to consume a single topic.
>
> -James
>
> > On Mar 22, 2016, at 5:21 PM, BYEONG-GI KIM wrote:
Hello.
I'd like to know how to implement a multi-threaded consumer which retrieves
message(s) from a topic per thread.
I read the Kafka Consumer 0.9.0.1 API document from
https://kafka.apache.org/090/javadoc/index.html?org/apache/kafka/clients/consumer/KafkaConsumer.html,
and I copied and pasted
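The thread-per-consumer pattern from that tutorial can be sketched as below: each thread owns its own KafkaConsumer instance, because KafkaConsumer is not thread-safe. The broker address, group id, and topic name are placeholders, and this assumes the 0.9.x clients API:

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import java.util.Collections;
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class MultiThreadedConsumer {
    public static void main(String[] args) {
        int numThreads = 3; // at most one useful thread per partition
        ExecutorService pool = Executors.newFixedThreadPool(numThreads);
        for (int i = 0; i < numThreads; i++) {
            pool.submit(() -> {
                Properties props = new Properties();
                props.put("bootstrap.servers", "localhost:9092"); // placeholder
                props.put("group.id", "monitoring-group");        // placeholder
                props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
                props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
                // KafkaConsumer is NOT thread-safe: each thread gets its own.
                try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                    consumer.subscribe(Collections.singletonList("monitoring")); // placeholder topic
                    while (true) {
                        ConsumerRecords<String, String> records = consumer.poll(1000);
                        for (ConsumerRecord<String, String> r : records)
                            System.out.printf("thread=%s offset=%d value=%s%n",
                                Thread.currentThread().getName(), r.offset(), r.value());
                    }
                }
            });
        }
    }
}
```

This is a sketch, not a definitive implementation; it needs a running broker, and threads beyond the partition count will sit idle.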
on timeout period,
> the heartbeat mechanism will actually kick the consumer out and do a
> rebalance. I'd be curious to understand more about your use case to see
> how/when you're experiencing overhead of the connection handshaking.
>
> On Mon, Mar 21, 2016 at 7:33 PM, BYEONG-GI
Hello. I have a question: does the latest Kafka, 0.9.0.1, provide any APIs
for managing a connection pool for Kafka on both the consumer and producer
sides? I think the overhead that occurs while establishing a connection from
a consumer/producer to the Kafka broker(s) seems a little heavy.
Thanks in advance!
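As far as I know, the 0.9 clients expose no pooling API; the usual pattern is to keep one long-lived KafkaProducer per process (it is documented as thread-safe and keeps broker connections alive) rather than opening a connection per send. A sketch, with the broker address and serializer choice as placeholder assumptions:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class SharedProducer {
    // One producer per JVM: KafkaProducer is thread-safe, so all
    // application threads can share it and no external pool is needed.
    private static final KafkaProducer<String, String> PRODUCER = create();

    private static KafkaProducer<String, String> create() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("key.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        return new KafkaProducer<>(props);
    }

    public static void send(String topic, String value) {
        PRODUCER.send(new ProducerRecord<>(topic, value));
    }
}
```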
; number of partitions.
>
>
> http://www.confluent.io/blog/how-to-choose-the-number-of-topicspartitions-in-a-kafka-cluster/
>
> -James
>
>
> > On Mar 1, 2016, at 4:11 PM, BYEONG-GI KIM wrote:
> >
> partitions somewhat to not hit the parallelization limit.
>
> Cheers,
> Jens
>
> On Wed, Mar 2, 2016 at 1:11 AM, BYEONG-GI KIM wrote:
>
Hello.
I have questions about how many partitions are optimal when using Kafka.
As far as I know, even if there are multiple consumers that belong to a
consumer group, say *group_A*, only one consumer can receive a Kafka
message produced by a producer if there is only one partition. So, as a
result, mul
IP
of the VM in the property, the source code worked very well from my host.
Anyway, thanks indeed for all the help!
Best regards
Kim
2016-01-21 11:48 GMT+09:00 Steve Tian :
> Have you checked the firewall setting on vm/host?
>
>
> On Thu, Jan 21, 2016, 10:29 AM BYEONG-GI KIM
Kim
2016-01-20 18:14 GMT+09:00 Steve Tian :
> Your code works in my environment. Are you able to run your producer code
> inside your vm? You can also debug via changing the log level to
> DEBUG/TRACE.
>
> Cheers, Steve
>
>
> On Wed, Jan 20, 2016, 4:30 PM BYEONG-GI KIM wrote:
>
>> Sure, I started the consumer before starting the producer and sending
>> messages, and my broker vers
messages accumulated
> in batch had been sent out.
>
> You can modify your producer as below to make it a sync call and test
> again.
>
> producer.send(new ProducerRecord("test",
> 0, Integer.toString(i), Integer.toString(i))).get();
>
> On Wed, 20 Jan 2016 at 16:31 BYEONG-
>
> On Wed, Jan 20, 2016, 3:57 PM BYEONG-GI KIM wrote:
>
Hello.
I set up the Kafka testbed environment on my VirtualBox, which simply has a
Kafka broker.
I tested the simple consumer & producer scripts, aka
bin/kafka-console-consumer.sh and bin/kafka-console-producer.sh respectively,
and both of them worked fine. I could see the output from the consumer side
I think I set
the topic/IP/port properly, at least.
Thanks in advance.
Best regards
KIM
2016-01-19 11:20 GMT+09:00 BYEONG-GI KIM :
Hello. I'm a newcomer who is getting started learning Kafka.
I'm now trying to develop a sample Java source code to test the new Kafka
Consumer, but it seems not to work correctly. I've been testing both
Kafka Consumer versions: the old one, called the High Level Consumer, and
another one, called a