The message size limit is imposed on the compressed message. To answer your
question about the effect of large messages: they cause memory pressure on
the Kafka brokers as well as on the consumer, since we re-compress messages
on the broker and decompress messages on the consumer.
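For illustration, here is a rough standalone sketch of that point: the limit
is checked against the compressed batch, not the individual messages. This is
plain Java using java.util.zip, not Kafka code, and the batch size, message
size, and payload generator are made-up assumptions.

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class CompressedBatchCheck {
    // Broker default for message.max.bytes (1 megabyte), per the thread below.
    static final int MESSAGE_MAX_BYTES = 1_000_000;

    public static void main(String[] args) throws IOException {
        // Hypothetical batch: 200 messages of ~20 KB each (~4 MB uncompressed).
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(compressed)) {
            for (int i = 0; i < 200; i++) {
                gzip.write(sampleMessage(i, 20 * 1024));
            }
        }
        System.out.printf("compressed batch = %d bytes, limit = %d bytes, fits = %b%n",
                compressed.size(), MESSAGE_MAX_BYTES,
                compressed.size() <= MESSAGE_MAX_BYTES);
    }

    // Made-up payload; real payloads will compress differently.
    static byte[] sampleMessage(int id, int size) {
        StringBuilder sb = new StringBuilder(size);
        while (sb.length() < size) {
            sb.append("{\"id\":").append(id).append(",\"field\":\"some value\"}");
        }
        return sb.toString().getBytes(StandardCharsets.UTF_8);
    }
}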

I'm not so sure that large messages will hurt latency, since compressing a
few large messages should not be any slower than compressing lots of small
messages with the same total content. But you do want to be careful with the
batch size, since you don't want the compressed message to exceed the
message size limit.
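As a rough sketch of being careful with the batch size, here is a minimal
producer configured for gzip batching. Note that this uses the newer Java
producer client rather than the 0.8-era producer discussed in this thread
(which used properties like compression.codec and batch.num.messages), and
the broker address, topic name, and sizes are assumptions.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class BatchingProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // assumed local broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("compression.type", "gzip");    // batches are compressed as a unit
        props.put("batch.size", "262144");        // ~256 KB of records per partition batch
        props.put("linger.ms", "5");              // wait briefly so batches can fill up
        props.put("max.request.size", "1000000"); // stay within the broker's message.max.bytes

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 1000; i++) {
                producer.send(new ProducerRecord<>("example-topic",
                        Integer.toString(i), "a ~20 KB payload would go here"));
            }
        }
    }
}

The intent is to keep the per-batch byte cap well under the 1 MB default so
there is headroom, since compression ratios vary with the payload.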

Thanks,
Neha


On Mon, Oct 7, 2013 at 9:10 AM, S Ahmed <sahmed1...@gmail.com> wrote:

> I see, so one thing to consider is that if I have 20 KB messages, I
> shouldn't batch too many together, as that will increase latency and the
> memory footprint on the producer side.
>
>
> On Mon, Oct 7, 2013 at 11:55 AM, Jun Rao <jun...@gmail.com> wrote:
>
> > At LinkedIn, our message size can be 10s of KB. This is mostly because we
> > batch a set of messages and send them as a single compressed message.
> >
> > Thanks,
> >
> > Jun
> >
> >
> > On Mon, Oct 7, 2013 at 7:44 AM, S Ahmed <sahmed1...@gmail.com> wrote:
> >
> > > When people use message queues, the message size is usually pretty
> > > small.
> > >
> > > I want to know who out there is using Kafka with larger payload sizes?
> > >
> > > In the configuration, the maximum message size by default is set to 1
> > > megabyte (message.max.bytes=1000000).
> > >
> > > My message sizes will probably be around 20-50 KB, but to me that is
> > > large for a message payload, so I'm wondering what effect that will have
> > > with Kafka.
> > >
> >
>
