You can refer to this closed issue on GitHub:
<https://github.com/wurstmeister/kafka-docker/issues/100>.
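The short version: kafka-console-producer.sh does not pick up
producer.properties on its own, so the max.request.size override has to be
passed on the command line (or via --producer.config). A minimal sketch,
reusing the broker, topic, and file path from your mail:

  # Raise the producer's request size limit for this run only;
  # 100000000 matches the value in your producer.properties.
  bin/kafka-console-producer.sh \
    --broker-list localhost:9092 \
    --topic metadata_upload \
    --producer-property max.request.size=100000000 \
    < /export/home/Userid/metadata/TopUBP-1_metadata.json

Alternatively, point the tool at your existing file with
--producer.config /path/to/producer.properties (placeholder path). One more
thing to check: fetch.message.max.bytes only applies to the old Scala
consumer; the Java consumer uses max.partition.fetch.bytes (and
fetch.max.bytes in newer releases) instead.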

On Thu, Sep 20, 2018 at 9:57 PM sarath reddy <putthasar...@gmail.com> wrote:

> ++pushkar
>
> On Thu 20 Sep, 2018, 16:20 sarath reddy, <putthasar...@gmail.com> wrote:
>
> > Hi Team,
> >
> > We are trying to configure Kafka to produce larger messages.
> >
> > Below are the configs:
> >
> > server.properties
> >
> > message.max.bytes=100000000
> > replica.fetch.max.bytes=100000001
> >
> > producer.properties
> >
> > max.request.size=100000000
> > compression.type=gzip
> >
> > consumer.properties
> >
> > fetch.message.max.bytes=100000000
> >
> > When trying to send a larger file with the command below:
> > bin/kafka-console-producer.sh --broker-list localhost:9092 --topic
> > metadata_upload < /export/home/Userid/metadata/TopUBP-1_metadata.json
> >
> > We get the error below:
> >
> > [2018-09-19 11:20:03,307] ERROR Error when sending message to topic
> > metadata_upload with key: null, value: 2170060 bytes with error:
> > (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
> > org.apache.kafka.common.errors.RecordTooLargeException: The message is
> > 2170148 bytes when serialized which is larger than the maximum request
> > size you have configured with the max.request.size configuration.
> >
> > Thanks,
> > Sarath Reddy.
> >


-- 
Thanks and Regards,
Subash Konar
