Hi All,

I have some log files of around 30GB, and I am trying to process these
logs as events by pushing them to Kafka. The throughput I am seeing
while publishing these events to Kafka is quite slow.
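For context, the pipeline is essentially a file input feeding the kafka
output; a minimal sketch of that shape (the path, topic, and broker
address below are placeholders, not my actual values):

    input {
      file {
        path => "/data/logs/app.log"                 # placeholder path to the 30GB file
        start_position => "beginning"                # read the file from the start
        sincedb_path => "/var/lib/logstash/sincedb"  # track read position across restarts
      }
    }

    output {
      kafka {
        bootstrap_servers => "kafka1:9092"  # placeholder broker address
        topic_id => "app-logs"              # placeholder topic name
      }
    }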

So, as mentioned, for a single 30GB log file Logstash has been
continuously emitting to Kafka for more than 2 days, but it has still
processed only about 60% of the log data. I am looking for a way to make
publishing events to Kafka more efficient, because at this rate of
ingestion I don't think it will be a viable option to move ahead with.
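From what I have read, the kafka output plugin exposes the standard
Kafka producer batching settings; a sketch of the kind of tuning I have
been considering (the values here are illustrative, not measured):

    output {
      kafka {
        bootstrap_servers => "kafka1:9092"  # placeholder
        topic_id => "app-logs"              # placeholder
        compression_type => "snappy"  # compress batches on the wire
        batch_size => 65536           # producer batch size in bytes (default 16384)
        linger_ms => 50               # wait up to 50 ms to fill a batch before sending
        acks => "1"                   # wait for the leader only, not all replicas
      }
    }

I also understand that pipeline.workers and pipeline.batch.size in
logstash.yml affect how fast events are pulled from the input side, so I
am not sure whether my bottleneck is the Kafka producer or the Logstash
pipeline itself.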

Looking for suggestions on improving the publishing performance here.

Expert advice needed!

Thanks!
