Re: RecordTooLargeException with old (0.10.0.0) consumer

2020-07-28 Thread Thomas Becker
Again, we haven't changed the default message size; I believe this exception is a red herring. On Tue, 2020-07-28 at 17:38 +, manoj.agraw...@cognizant.com wrote:

Re: RecordTooLargeException with old (0.10.0.0) consumer

2020-07-28 Thread Manoj.Agrawal2
Hi, you also need to make the change on the producer and consumer side as well. server.properties: message.max.bytes=15728640 replica.fetch.max.bytes=15728640 max.request.size=15728640 fetch.message.max.bytes=15728640 and producer.properties: max.request.size=15728640 consumer max.partition.fetch.
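Collected into the usual config files, the settings in that reply look roughly like this (a sketch using the values from the email; which file each property belongs in is my reading, since the truncated message does not spell it out):

```
# server.properties (broker)
message.max.bytes=15728640
replica.fetch.max.bytes=15728640

# producer.properties (producer)
max.request.size=15728640

# consumer.properties (consumer)
max.partition.fetch.bytes=15728640
```

Note that the broker, producer, and consumer limits must all be raised together; any one of them left at its default will still reject or refuse to fetch a large record.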

RecordTooLargeException with old (0.10.0.0) consumer

2020-07-28 Thread Thomas Becker
We have some legacy applications using an old (0.10.0.0) version of the consumer that are hitting RecordTooLargeExceptions with the following message: org.apache.kafka.common.errors.RecordTooLargeException: There are some messages at [Partition=Offset]: {mytopic-0=13920987} whose size is larger
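For background (my addition, not stated in the truncated message): in the 0.10-era Java consumer, a record larger than max.partition.fetch.bytes could not be consumed at all and surfaced as a RecordTooLargeException, so the usual workaround is to raise that setting above the largest message on the topic. The older Scala consumer's equivalent knob is fetch.message.max.bytes:

```
# 0.10 Java consumer
max.partition.fetch.bytes=15728640

# legacy Scala consumer
fetch.message.max.bytes=15728640
```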

Re: RecordTooLargeException on 16M messages in Kafka?

2019-08-16 Thread Jonathan Santilli
OK, so it means that setting the config max.request.size via the KafkaProducer class worked correctly, right? Now you are wondering about the kafka-console-producer.sh utility, right? If so, have you tried using the parameter --producer.config /path/to/the/config/producer.properties? Just let me
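The suggestion above can be sketched as follows (the file path, broker address, and topic name are assumptions; the Kafka command itself is shown but not run, since it needs a live broker):

```shell
# Write the producer override into a properties file...
cat > /tmp/producer.properties <<'EOF'
max.request.size=16777216
EOF

# ...then pass it to the console producer (requires a running broker):
echo 'kafka-console-producer.sh --broker-list localhost:9092 --topic test \
  --producer.config /tmp/producer.properties < ./large-file'
```

Without --producer.config, the console producer ignores producer.properties entirely and uses its built-in defaults, which is why editing the file alone appears to have no effect.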

Re: RecordTooLargeException on 16M messages in Kafka?

2019-08-15 Thread l vic
Setting max.request.size worked when I did it in producer code; I don't understand what producer.properties is for, seems like it's not used. On Thu, Aug 15, 2019 at 2:25 PM Jonathan Santilli < jonathansanti...@gmail.com> wrote: > I have asked because I did not see that in your previous email when

Re: RecordTooLargeException on 16M messages in Kafka?

2019-08-15 Thread Jonathan Santilli
I have asked because I did not see that in your previous email when you tried the console producer. Jonathan. On Thu, Aug 15, 2019, 3:07 PM l vic wrote: > yes, in producer.properties > > On Thu, Aug 15, 2019 at 9:59 AM Jonathan Santilli < > jonathansanti...@gmail.com> wrote: > > > Just to be su

Re: RecordTooLargeException on 16M messages in Kafka?

2019-08-15 Thread l vic
yes, in producer.properties On Thu, Aug 15, 2019 at 9:59 AM Jonathan Santilli < jonathansanti...@gmail.com> wrote: > Just to be sure, please confirm the configuration parameter is well > set/configure at producer level: > > max.request.size = 12390 (for instance) > > Cheers! > -- > Jonathan >

Re: RecordTooLargeException on 16M messages in Kafka?

2019-08-15 Thread Jonathan Santilli
Just to be sure, please confirm the configuration parameter is well set/configured at the producer level: max.request.size = 12390 (for instance) Cheers! -- Jonathan On Thu, Aug 15, 2019 at 1:44 PM l vic wrote: > I tested it with kafka-console-consumer and kafka-console-producer reading > fr

Re: RecordTooLargeException on 16M messages in Kafka?

2019-08-15 Thread claude.war...@wipro.com.INVALID
… I have plans to do so in the future. From: l vic Sent: Thursday, August 15, 2019 13:48 To: users@kafka.apache.org Subject: Re: RecordTooLargeException on 16M messages in Kafka? ** This mail has been sent from an external source. Treat hyperlinks and attachments

Re: RecordTooLargeException on 16M messages in Kafka?

2019-08-15 Thread l vic
I tested it with kafka-console-consumer and kafka-console-producer reading from a 16M text file (no newlines): kafka-console-producer.sh --broker-list :6667 --topic test < ./large-file The error comes out on the producer side: org.apache.kafka.common.errors.RecordTooLargeException: The message is 16

Re: RecordTooLargeException on 16M messages in Kafka?

2019-08-15 Thread l vic
I tested it with kafka-console-consumer and kafka-console-producer reading from large text file (no newlines): kafka-console-producer.sh --broker-list 10.10.105.24:6667 --topic test < ./large-file On Thu, Aug 15, 2019 at 4:49 AM Jonathan Santilli < jonathansanti...@gmail.com> wrote: > Hello

Re: RecordTooLargeException on 16M messages in Kafka?

2019-08-15 Thread l vic
Yes, it's still there On Thu, Aug 15, 2019 at 4:49 AM Jonathan Santilli < jonathansanti...@gmail.com> wrote: > Hello, try to send and flush just one message of 16777239 bytes, to verify > the error still shows up. > > Cheers! > -- > Jonathan > > > > On Thu, Aug 15, 2019 at 2:23 AM l vic wrote: >

Re: RecordTooLargeException on 16M messages in Kafka?

2019-08-15 Thread Jonathan Santilli
Hello, try to send and flush just one message of 16777239 bytes, to verify the error still shows up. Cheers! -- Jonathan On Thu, Aug 15, 2019 at 2:23 AM l vic wrote: > My kafka (1.0.0) producer errors out on large (16M) messages. > ERROR Error when sending message to topic test with key: nul

RecordTooLargeException on 16M messages in Kafka?

2019-08-14 Thread l vic
My kafka (1.0.0) producer errors out on large (16M) messages. ERROR Error when sending message to topic test with key: null, value: 16777239 bytes with error: (org.apache.kafka.clients.producer.internals. ErrorLoggingCallback) org.apache.kafka.common.errors.RecordTooLargeException: The message is

Re: RecordTooLargeException

2018-07-20 Thread Jerry Richardson
Setting max.request.size does not prevent RecordTooLargeException from being thrown. Thanks for the header info. 9. Jul 2018 09:20 by jiangtao@zuora.com.INVALID: > you can configure max.request.size > https://kafka.apache

Re: RecordTooLargeException

2018-07-09 Thread Tony Liu
> than 1MB, I want to truncate or throw away the message to avoid the > RecordTooLargeException. What is the max size of the headers? Where is > the key size + value size + header size calculated? > > 5. Jul 2018 14:15 by jiangtao@zuora.com

Re: RecordTooLargeException

2018-07-06 Thread Jerry Richardson
My administrator will not allow messages larger than 1MB to be stored in Kafka. How can I limit the size of my messages to 1MB? If I have a message larger than 1MB, I want to truncate or throw away the message to avoid the RecordTooLargeException.  What is the max size of the headers?  Where is
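One way to do the client-side check the question asks about (a sketch under assumptions: 1048576 is the broker's default message.max.bytes, and the payload file here is synthetic):

```shell
MAX_BYTES=1048576                         # broker default message.max.bytes
head -c 2000000 /dev/zero > /tmp/payload  # ~2 MB synthetic payload

size=$(wc -c < /tmp/payload)
if [ "$size" -gt "$MAX_BYTES" ]; then
  # Too big: drop (or truncate) it instead of letting the broker reject it
  echo "skip: $size bytes exceeds $MAX_BYTES"
else
  echo "ok to produce: $size bytes"
fi
```

This only bounds the value; as noted later in the thread, the client counts key + value + headers together, so a check against the exact limit should leave some headroom.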

Re: RecordTooLargeException

2018-07-05 Thread Tony Liu
Sorry, just saw your email a little late. I am confused about why you said `I cannot increase the max size of messages stored in Kafka`; can you add some explanation? On Thu, Jul 5, 2018 at 10:09 AM, Jerry Richardson < jerryrichard...@tutanota.com> wrote: > > What class is this in? What's the m

Re: RecordTooLargeException

2018-07-05 Thread Jerry Richardson
What class is this in? What's the maximum header size? Is there documentation on this? 2. Jul 2018 18:29 by jiangtao@zuora.com.INVALID: > You can consider increasing `max.request.size` a little (the default > value is `1048576`), after checking Kafka c

Re: RecordTooLargeException

2018-07-03 Thread jerryrichardson
I cannot increase the max size of messages stored in Kafka. How do I limit them to avoid the RecordTooLargeException? 2. Jul 2018 18:29 by jiangtao@zuora.com.INVALID: > You can consider increasing `max.request.size` a little (the default

Re: RecordTooLargeException

2018-07-02 Thread Tony Liu
You can consider increasing `max.request.size` a little (the default value is `1048576`). After checking the Kafka client source code, they count [`key size` + `value size` + `header size` + others] together, so it's possible the calculated size is a little bigger than the default value. Please ch

RecordTooLargeException

2018-07-02 Thread jerryrichardson
Hi all, I get this error even when my records are smaller than the 112 byte limit: org.apache.kafka.common.errors.RecordTooLargeException: The request included a message larger than the max message size the server will accept. How do I ensure that my producer doesn't send records that are t

Consumer group stuck rebalancing with RecordTooLargeException

2017-03-07 Thread Robert Quinlivan
Hello, I have a consumer group that is continually stuck in a rebalance with the following error being produced in the broker logs: [2017-03-07 22:16:20,473] ERROR [Group Metadata Manager on Broker 0]: Appending metadata message for group tagfire_application generation 951 failed due to org.apache
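A mitigation often suggested for this symptom (my addition — the thread is cut off before any resolution, so treat this as an assumption): the group metadata record appended to the internal __consumer_offsets topic can itself exceed the broker's message limit, and a topic-level override raises the limit for just that topic. With the kafka-configs tool of that era, the command would look roughly like:

```
kafka-configs.sh --zookeeper localhost:2181 --alter \
  --entity-type topics --entity-name __consumer_offsets \
  --add-config max.message.bytes=2097152
```

Shrinking the group's subscription metadata (fewer topics per member, or fewer members) is the alternative if raising the limit is not an option.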