By any chance, has anyone worked with Kafka using messages that are
approximately 50MB in size?  Based on some of the previous threads, there are
probably concerns about memory pressure due to compression on the broker and
decompression on the consumer, as well as best practices for tuning batch size
(so that the compressed message ultimately doesn't exceed the message size
limit).
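For context, this is roughly the configuration I have in mind (just a sketch
using the standard Java client; the broker address and the exact byte values
are placeholders for the ~50MB case, and the broker/consumer settings I believe
are relevant are noted in the comments):

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.ByteArraySerializer;

    public class LargeMessageProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");  // placeholder
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);

            // Producer-side cap on a single request; must be at least as large
            // as the biggest (compressed) message we intend to send.
            props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, 52428800);    // ~50MB
            // Compress on the producer so the payload on the wire is smaller.
            props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "gzip");
            // Keep batching modest so a batch of messages doesn't push a
            // request past the size limit.
            props.put(ProducerConfig.BATCH_SIZE_CONFIG, 65536);
            // The producer's total buffer needs headroom for in-flight ~50MB payloads.
            props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 134217728);      // ~128MB

            try (KafkaProducer<byte[], byte[]> producer = new KafkaProducer<>(props)) {
                // producer.send(...) calls with the large byte[] payloads would go here.
            }

            // On the broker side, I believe the matching settings are
            // message.max.bytes (or max.message.bytes at the topic level) and
            // replica.fetch.max.bytes; on the consumer side, fetch.max.bytes
            // and max.partition.fetch.bytes would need similar bumps.
        }
    }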

Any other best practices or thoughts concerning this scenario?

Thanks!
Denny
