Dear All,
I am running a Kafka 0.9.0.1 cluster (Scala 2.11 build, 2.11_0.9.0.1) and using the new consumer from the same version.
I have set the default quota configuration on the brokers as follows:
quota.producer.default=1000000
quota.consumer.default=1000000
When I consume data with the new consumer, the following error sometimes occurs:
org.apache.kafka.common.protocol.types.SchemaException: Error reading field 'responses': Error reading array of size 1140343, only 37 bytes available
        at org.apache.kafka.common.protocol.types.Schema.read(Schema.java:73)
        at org.apache.kafka.clients.NetworkClient.handleCompletedReceives(NetworkClient.java:439)
        at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:265)
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.clientPoll(ConsumerNetworkClient.java:320)
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:213)
        at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:193)
        at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:908)
        at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:853)
        at com.fw.kafka.ConsumerThread.run(TimeOffsetPair.java:458)
The error does not occur on every run, but when it does appear it repeats many times in a row.
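
For context, the consumer thread is essentially a standard new-consumer poll loop, roughly like the sketch below; the bootstrap servers, group id, deserializers, and topic name here are placeholders rather than my real settings:

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ConsumerThread implements Runnable {
    @Override
    public void run() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092");      // placeholder
        props.put("group.id", "my-consumer-group");          // placeholder
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        try {
            consumer.subscribe(Collections.singletonList("my-topic")); // placeholder topic
            while (true) {
                // The SchemaException in the stack trace above is thrown from inside this poll() call.
                ConsumerRecords<String, String> records = consumer.poll(1000);
                for (ConsumerRecord<String, String> record : records) {
                    // process record ...
                }
            }
        } finally {
            consumer.close();
        }
    }
}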