Hi users, has anyone faced this issue?
I have gone through multiple articles, but they give different answers, so I want to check with Kafka users. Below are the settings I have on my Kafka cluster. What are the tuning parameters to overcome this large-message-size issue?

Kafka version: 0.11
Number of nodes in the Kafka cluster: 3
Topics and partitions: 1 topic with 10 partitions
Message size: up to 5 MB
max.message.bytes on the topic: 2 MB

Error message:

```
2019-04-09 00:00:02.469 ERROR 35301 --- [ad | producer-1] c.b.a.s.p.KafkaTelemetryConsumer : Failed to send TelemetryHarvesterServer with data size 1090517 to kafka.
org.springframework.kafka.core.KafkaProducerException: Failed to send; nested exception is org.apache.kafka.common.errors.RecordTooLargeException: The request included a message larger than the max message size the server will accept.
```
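For context, here is a sketch of the settings that are usually involved when records exceed the configured limits; the message size limit has to be raised consistently on the producer, the broker/topic, and the consumer side. The 6 MB value below is only an illustrative assumption (a little above the stated 5 MB maximum), not a recommendation:

```properties
# Producer side: the maximum size of a request the producer will send.
max.request.size=6291456

# Broker side (or per-topic via max.message.bytes): the largest record
# batch the broker will accept.
message.max.bytes=6291456

# Broker side: replicas must be able to fetch the largest message,
# otherwise followers cannot replicate large records.
replica.fetch.max.bytes=6291456

# Topic-level override (set with kafka-configs/kafka-topics instead of
# the 2 MB currently configured).
max.message.bytes=6291456

# Consumer side: the maximum data returned per partition per fetch;
# this must also be at least as large as the biggest message.
max.partition.fetch.bytes=6291456
```

These names are the standard Kafka configuration keys; whether raising all of them is the right fix for your workload (versus splitting or compressing the 5 MB payloads) is a separate design question.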