Hello all,

I'm getting the error below when publishing messages to a Kafka topic over SSL.

The command used to publish messages:

/usr/hdp/2.5.3.0-37/kafka/bin/kafka-producer-perf-test.sh --messages 1000000 \
  --message-size 1000 --topics mmtest4 \
  --broker-list <host1>:9093,<host2>:9093,<host3>:9093, \
  --threads 1 --compression-codec 3 --batch-size 10000 \
  --security-protocol SSL --show-detailed-stats



[2017-12-21 19:48:49,846] WARN Fetching topic metadata with correlation id 11 for topics [Set(mmtest4)] from broker [BrokerEndPoint(0,<host1>,9093)] failed (kafka.client.ClientUtils$)
java.io.EOFException
        at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:99)
        at kafka.network.BlockingChannel.readCompletely(BlockingChannel.scala:140)
        at kafka.network.BlockingChannel.receive(BlockingChannel.scala:131)
        at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:84)
        at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:81)
        at kafka.producer.SyncProducer.send(SyncProducer.scala:126)
        at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:59)
        at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:83)
        at kafka.producer.BrokerPartitionInfo.getBrokerPartitionInfo(BrokerPartitionInfo.scala:50)
        at kafka.producer.async.DefaultEventHandler.kafka$producer$async$DefaultEventHandler$$getPartitionListForTopic(DefaultEventHandler.scala:206)
        at kafka.producer.async.DefaultEventHandler$$anonfun$partitionAndCollate$1.apply(DefaultEventHandler.scala:170)
        at kafka.producer.async.DefaultEventHandler$$anonfun$partitionAndCollate$1.apply(DefaultEventHandler.scala:169)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at kafka.producer.async.DefaultEventHandler.partitionAndCollate(DefaultEventHandler.scala:169)
        at kafka.producer.async.DefaultEventHandler.dispatchSerializedData(DefaultEventHandler.scala:101)
        at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:78)
        at kafka.producer.async.ProducerSendThread.tryToHandle(ProducerSendThread.scala:106)
        at kafka.producer.async.ProducerSendThread$$anonfun$processEvents$3.apply(ProducerSendThread.scala:89)
        at kafka.producer.async.ProducerSendThread$$anonfun$processEvents$3.apply(ProducerSendThread.scala:69)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at kafka.producer.async.ProducerSendThread.processEvents(ProducerSendThread.scala:68)
        at kafka.producer.async.ProducerSendThread.run(ProducerSendThread.scala:46)
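
One detail that may matter: every frame in the trace (kafka.producer.SyncProducer, kafka.network.BlockingChannel) belongs to the old Scala producer, which as far as I know speaks only PLAINTEXT — it would send an unencrypted request to the SSL port and have the connection closed on it, which matches the EOFException. If that is what is happening here, the new-producer form of the perf test might behave differently; a sketch, with the hosts as placeholders and a client truststore path I made up:

/usr/hdp/2.5.3.0-37/kafka/bin/kafka-producer-perf-test.sh \
  --topic mmtest4 --num-records 1000000 --record-size 1000 --throughput -1 \
  --producer-props bootstrap.servers=<host1>:9093,<host2>:9093,<host3>:9093 \
      security.protocol=SSL \
      ssl.truststore.location=/etc/kafka/truststore/kafka.client.truststore.jks \
      ssl.truststore.password=password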
SSL itself seems to be set up correctly: when I run the following command against a broker, I get the expected result:

openssl s_client -debug -connect host1:9093 -tls1
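
To narrow things down further, I can also exercise the Kafka protocol itself over SSL (not just the TLS handshake) with the console producer and an explicit client config; the file name and truststore path below are placeholders:

# /tmp/client-ssl.properties
security.protocol=SSL
ssl.truststore.location=/etc/kafka/truststore/kafka.client.truststore.jks
ssl.truststore.password=password

/usr/hdp/2.5.3.0-37/kafka/bin/kafka-console-producer.sh --broker-list <host1>:9093 \
  --topic mmtest4 --producer.config /tmp/client-ssl.properties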

Settings in server.properties (managed through Ambari):

listeners=PLAINTEXT://localhost:9092,SSL://localhost:9093
advertised.listeners=PLAINTEXT://localhost:9092,SSL://localhost:9093
ssl.truststore.location=/etc/kafka/truststore/kafka.server.truststore.jks
ssl.truststore.password=password
ssl.keystore.location=/etc/kafka/truststore/kafka.server.keystore.jks
ssl.keystore.password=password
ssl.key.password=password
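
One thing I'm unsure about: the listeners/advertised.listeners lines say localhost. If that is literally what each broker writes out (rather than Ambari substituting the broker's real hostname), remote clients would be handed localhost in metadata and fail to connect. What I would expect on each broker is something along these lines (hostname is a placeholder):

listeners=PLAINTEXT://<broker-fqdn>:9092,SSL://<broker-fqdn>:9093
advertised.listeners=PLAINTEXT://<broker-fqdn>:9092,SSL://<broker-fqdn>:9093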
These settings work in another environment; in this environment (Prod), however, I get the error shown above. Any ideas on how to debug or fix this?
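
For reference, the next thing I plan to try is turning on JSSE handshake logging on the client side before rerunning the producer (standard JVM flag, picked up through KAFKA_OPTS):

export KAFKA_OPTS="-Djavax.net.debug=ssl,handshake"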
