Thank you Jun,

regards,
Rahul

On Wed, Mar 13, 2013 at 12:21 AM, Jun Rao <jun...@gmail.com> wrote:

> 1. Compression happens on the list of messages.
> 2. We compress the list of messages into a single new message, and the size
> of that compressed message has to be smaller than max.message.size and
> fetch.size in the consumer.
> 3. Yes, typically the bigger the list, the better the compression ratio.
> However, there are diminishing returns once the list is big enough.
>
> Thanks,
>
> Jun
>
> On Tue, Mar 12, 2013 at 10:10 AM, R S <mypostbo...@gmail.com> wrote:
>
> > I have a question regarding GZIP compression.
> >
> > My env: I have enabled compression.codec=1, which is essentially GZIP. I am
> > using the ProducerData API on the producer side.
> > We use the following call in the code:
> >
> > producer.send(new ProducerData<Integer, String>(topic, messageList));
> >
> > messageList is a list of messages, each of which is a String.
> > The number of messages in messageList is always 2000.
> > Each message is 1 KB, so the total payload in each call of producer.send
> > is (2000 * 1 KB).
> >
> > max.message.size is set to 3000 KB.
> > I use the same value on the consumer side via fetch.size = 3000 KB.
> >
> > Questions:
> > 1. Will compression compress each message in the list individually, or
> > will it compress the whole list together as a single blob?
> > 2. Regarding max.message.size: does Kafka check this limit against each
> > message, or against the size of the whole messageList?
> > 3. Will a bigger messageList give me better compression?
> >
> > -rahul
> >
>
