1GB sounds like a tad steep. You may want to do some testing, as Kafka
needs to be told that such large messages can arrive, and the broker will
then pre-allocate buffers for them. Personally, I'd cap message sizes in
the low megabytes; anything bigger can be dropped off in e.g. S3, and then
you just queue a link to it for further processing.
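For reference, here's a minimal sketch of what that looks like from the
producer side. The broker address, topic name, S3 key and the 10 MB cap are
just placeholders; the config names themselves are standard Kafka settings.
On the broker side you'd raise message.max.bytes and replica.fetch.max.bytes
to match, and consumers would need max.partition.fetch.bytes bumped as well.

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class StoreMessageProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-broker:9092");   // placeholder broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // Raise the producer's request cap to a few MB; the broker's
        // message.max.bytes / replica.fetch.max.bytes and the consumer's
        // max.partition.fetch.bytes need matching increases.
        props.put("max.request.size", Integer.toString(10 * 1024 * 1024)); // 10 MB, an example cap

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Anything bigger than the cap goes to object storage; only a
            // pointer to it travels through Kafka.
            String payloadLocation = "s3://store-payloads/store-0042/2016-03-04/batch.xml"; // placeholder key
            producer.send(new ProducerRecord<>("store-messages", "store-0042", payloadLocation));
        }
    }
}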

I'm not saying it's impossible; Kafka handles large messages better than
most other tools out there. But in any case, you do want a test setup to
make sure it'll handle the sort of traffic you'll fling at it.

On Fri, Mar 4, 2016 at 4:26 AM, Mahesh Dharmasena <mahesh....@gmail.com>
wrote:

> We have a client with several thousand stores that send messages to and
> receive messages from the main system at headquarters.
>
> A single store sends and receives around 50 to 100 messages per day.
>
> Average message size could range from 2KB to 1GB.
>
> Please let me know whether Apache Kafka would be suitable for this
> solution.
>
> - Mahesh.
>



-- 

*Cees de Groot*
PRINCIPAL SOFTWARE ENGINEER
pagerduty.com <http://pagerduty.com/>
c...@pagerduty.com
+1(416)435-4085

