We are using the latest Kafka and Logstash versions to ingest log data from several business apps (a few now, but eventually 100+) into ELK. We have a standardized logging structure that the business apps use to write log data to Kafka topics, and we are able to ingest it into ELK via the Kafka input plugin.
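For reference, here is roughly what one of our per-app Logstash pipelines looks like (a simplified sketch; the broker addresses, topic name, group id, index name, and the JSON codec are placeholders for our actual values):

# Sketch of one per-app pipeline; names below are placeholders.
input {
  kafka {
    bootstrap_servers => "kafka01:9092,kafka02:9092,kafka03:9092"
    topics            => ["app1-logs"]     # one topic per business app
    group_id          => "logstash-app1"   # shared by all 3 Logstash consumers, so the 3 partitions are split among them
    codec             => "json"            # assumes our standardized log structure is JSON
  }
}

output {
  elasticsearch {
    hosts => ["http://es01:9200"]
    index => "app1-logs-%{+YYYY.MM.dd}"
  }
}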
Currently, we use one Kafka topic per business app for pushing data into Logstash, with 3 Logstash consumers and 3 partitions on each topic. I am wondering about best practices for using Kafka with Logstash. Is the above configuration a good approach, or is there a better one? For example, instead of one Kafka topic per app, should we use a single Kafka topic shared across all apps? What are the pros and cons? (If you are not familiar with Logstash, it is part of the Elastic Stack and is just another consumer for Kafka.) Would appreciate your input!

-- Thanks, Ram