[ https://issues.apache.org/jira/browse/KAFKA-1077?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13789938#comment-13789938 ]

Xiejing commented on KAFKA-1077:
--------------------------------

Thanks, Rao. The error was thrown in BoundedByteBufferReceive.scala, which seems
to have a protection mechanism:

      if(size > maxSize)
        throw new InvalidRequestException("Request of length %d is not valid, it is larger than the maximum size of %d bytes.".format(size, maxSize))
      contentBuffer = byteBufferAllocate(size)

So what confuses me is this: in my case 'size' is 850M and 'maxSize' is 100M,
so why was no InvalidRequestException thrown?
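For reference, a minimal sketch of the bounded-receive pattern quoted above (hypothetical class and method names, not Kafka's actual code): read the 4-byte size header first, validate it against the configured maximum, and only then allocate the content buffer. If the check runs with the expected maxSize, a size of ~850M must throw before any allocation happens.

```java
import java.nio.ByteBuffer;

public class BoundedReceiveSketch {
    static class InvalidRequestException extends RuntimeException {
        InvalidRequestException(String msg) { super(msg); }
    }

    // sizeHeader holds the first 4 bytes read from the socket: the payload size.
    static ByteBuffer readBounded(ByteBuffer sizeHeader, int maxSize) {
        int size = sizeHeader.getInt();
        if (size <= 0)
            throw new InvalidRequestException(size + " is not a valid request size.");
        if (size > maxSize)
            throw new InvalidRequestException(String.format(
                "Request of length %d is not valid, it is larger than the maximum size of %d bytes.",
                size, maxSize));
        // Only reached when size <= maxSize, so the allocation is bounded.
        return ByteBuffer.allocate(size);
    }
}
```

Note that allocating a buffer for 858861616 bytes would need roughly 820 MB of contiguous heap in one call, which by itself explains the OutOfMemoryError if the size check is not applied (or is applied with a different maxSize) on the path that allocated.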

> OutOfMemoryError when consuming large messages
> ----------------------------------------------
>
>                 Key: KAFKA-1077
>                 URL: https://issues.apache.org/jira/browse/KAFKA-1077
>             Project: Kafka
>          Issue Type: Bug
>          Components: config, network
>            Reporter: Xiejing
>            Assignee: Jun Rao
>
> We set 'socket.request.max.bytes' to 100 * 1024 * 1024, but still see an
> OutOfMemoryError when consuming messages (size 1M).
> e.g.  
> [08/10/13 05:44:47:047 AM EDT] 102 ERROR network.BoundedByteBufferReceive: 
> OOME with size 858861616
> java.lang.OutOfMemoryError: Java heap space 
> 858861616 is much larger than 100 * 1024 * 1024 but no 
> InvalidRequestException is thrown in BoundedByteBufferReceive



--
This message was sent by Atlassian JIRA
(v6.1#6144)
