Hello!
  We are using Kafka 0.9.1. We have created a class CustomKafkaConsumer
whose method receiveFromKafka has the following pseudocode:

    public OurClassStructure[] receiveFromKafka()
    {
        // get the messages from the topic
        ConsumerRecords<Integer, V> received =
            org.apache.kafka.clients.consumer.KafkaConsumer.poll(timeout);

        // transform received into our OurClassStructure[]
    }


We would like to use this CustomKafkaConsumer to meet two needs:
1. get the Kafka messages from a long-lived thread (such as a Storm spout,
in the nextTuple method)
2. get the Kafka messages from a Java scheduled thread executor at a
one-second frequency


    scheduledThreadPoolExecutor.scheduleAtFixedRate(
        () -> {
            OurClassStructure[] km = customKafkaConsumer.receiveFromKafka();
            // doSomething with km
        },
        initialDelay, 1, TimeUnit.SECONDS);
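To make the scheduled approach concrete, here is a minimal, runnable sketch of that pattern using only the JDK. The Kafka consumer is replaced by a BlockingQueue stand-in (topicStandIn and receiveFromStandIn are names I made up for this sketch, not part of kafka-clients); the point is the single-threaded scheduled executor, which matters because KafkaConsumer is not safe for concurrent use.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ScheduledPollSketch {
    // Stand-in for the Kafka topic; a real receiveFromKafka() would call poll().
    public static final BlockingQueue<String> topicStandIn = new LinkedBlockingQueue<>();

    // Drains whatever is currently buffered, mimicking receiveFromKafka().
    public static List<String> receiveFromStandIn() {
        List<String> batch = new ArrayList<>();
        topicStandIn.drainTo(batch);
        return batch;
    }

    public static void main(String[] args) throws Exception {
        topicStandIn.addAll(List.of("m1", "m2", "m3"));
        List<String> seen = new CopyOnWriteArrayList<>();

        // Single-threaded executor: all "poll" calls run serially, never
        // concurrently, which is the property a KafkaConsumer would require.
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(
                () -> seen.addAll(receiveFromStandIn()),
                0, 1, TimeUnit.SECONDS);

        TimeUnit.MILLISECONDS.sleep(500);   // let the first scheduled run fire
        scheduler.shutdown();
        scheduler.awaitTermination(2, TimeUnit.SECONDS);
        System.out.println(seen);           // prints [m1, m2, m3]
    }
}
```

Note that scheduleAtFixedRate only guarantees serial execution of the task; with a real consumer you would still need to make sure no other thread ever touches it.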


I've seen that the recommended way (in the book Kafka: The Definitive
Guide, chapter 4, and also in this post
http://www.confluent.io/blog/tutorial-getting-started-with-the-new-apache-kafka-0.9-consumer-client)
to consume from Kafka is to have an infinite loop and poll for messages
inside that loop.
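For comparison, the recommended long-lived loop can be sketched with the same stdlib stand-in (again, source, running and runLoop are names I invented for the sketch): a loop that keeps polling with a bounded timeout until a shutdown flag is flipped, which is the shape a Storm spout or a dedicated consumer thread would take.

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

public class PollLoopSketch {
    final BlockingQueue<String> source = new LinkedBlockingQueue<>(); // stand-in for consumer.poll()
    final AtomicBoolean running = new AtomicBoolean(true);
    final List<String> processed = new CopyOnWriteArrayList<>();

    // The long-lived loop the Kafka docs recommend: poll until asked to stop.
    void runLoop() {
        while (running.get()) {
            try {
                // bounded wait, analogous to consumer.poll(timeout)
                String record = source.poll(100, TimeUnit.MILLISECONDS);
                if (record != null) processed.add(record);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
        }
        // real code would call consumer.close() here
    }

    void shutdown() { running.set(false); }

    public static void main(String[] args) throws Exception {
        PollLoopSketch sketch = new PollLoopSketch();
        Thread t = new Thread(sketch::runLoop);
        t.start();
        sketch.source.addAll(List.of("a", "b"));
        TimeUnit.MILLISECONDS.sleep(300);
        sketch.shutdown();
        t.join();
        System.out.println(sketch.processed); // prints [a, b]
    }
}
```

The difference from the scheduled variant is who owns the cadence: here the loop itself controls how often it polls, rather than an external scheduler firing the task once per second.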

My questions are:

1. Is our approach with scheduled consumption a good one?
2. If not, what are the caveats/gotchas of our approach?
3. Should we adhere to the recommended way to consume messages from
Kafka?

I look forward to your answers.
Regards,
 Florin
