Hi everyone,

I have a question that is not strictly related to Flink, but maybe one of you 
can help me understand what I am doing wrong.

I have a Flink job that processes events generated by a Java application. The 
output of the Flink job is emitted to Kafka, and the Java application runs a 
Kafka consumer to receive the results computed by Flink. On my local setup 
everything works fine. I then deployed the application in a cluster environment 
with a dedicated Kafka cluster. The Java app runs separately, on a machine that 
has access to the Kafka cluster, but it does not consume any Kafka messages. 
However, if I run the Kafka console consumer on the same machine, I can see the 
messages correctly.

I configure the Java Kafka consumer with the following properties:
properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
properties.put(ConsumerConfig.GROUP_ID_CONFIG, groupIdentifier);
properties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
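For context, the rest of the consumer is a standard subscribe/poll loop, roughly like the sketch below; the topic name "flink-output" and the poll timeout are placeholders, not the actual values from my setup:

```java
import java.time.Duration;
import java.util.Collections;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

// "properties" is the java.util.Properties object configured above.
KafkaConsumer<String, String> consumer = new KafkaConsumer<>(properties);
// "flink-output" is a placeholder for the topic the Flink job writes to.
consumer.subscribe(Collections.singletonList("flink-output"));
while (true) {
    // Block up to 1 second waiting for new records.
    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(1000));
    for (ConsumerRecord<String, String> record : records) {
        System.out.printf("offset=%d key=%s value=%s%n",
                record.offset(), record.key(), record.value());
    }
}
```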
Do you have any idea what I might be doing wrong?

Thank you so much.

Best,


Gabriele
