Hi Namita and Pasi,

Logstash as a middleman is good only if:

1. You don't need a buffer in between and you are OK with tight coupling between source and destination.
2. There are a sufficient number of Logstash servers available in case of large data volumes.

With Logstash, the source and the destination must both be available at all times: if MongoDB is down, there is a risk of losing data. With Kafka as a broker, you have a chance of recovering failed ingestions, and there are many other benefits of using Kafka as well.
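For example, a minimal Logstash pipeline using the two plugins Pasi linked might look like the sketch below (untested; the hosts, index name, and topic name are placeholders, not values from your setup):

input {
  elasticsearch {
    # placeholder Elasticsearch address and index name
    hosts => ["http://localhost:9200"]
    index => "my-index"
    query => '{ "query": { "match_all": {} } }'
  }
}

output {
  kafka {
    # placeholder Kafka broker and topic name
    bootstrap_servers => "localhost:9092"
    topic_id => "es-export"
    codec => json
  }
}

This gets your documents into Kafka, and from there a sink (Kafka Connect or another Logstash pipeline) can write to MongoDB, with the topic acting as the buffer I mentioned above.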
Regards,
Sunil.

On Sat, 1 Oct 2022 at 4:27 PM, Pasi Hiltunen <pasi.hiltu...@netum.fi> wrote:

> Hi Namita,
>
> You can use Logstash as a middleman:
>
> https://www.elastic.co/guide/en/logstash/current/plugins-inputs-elasticsearch.html
> https://www.elastic.co/guide/en/logstash/current/plugins-outputs-kafka.html
>
> -pasi
>
>
> From: Namita Jaokar <jaokarnami...@gmail.com>
> Date: Friday, 30. September 2022 at 19.56
> To: users@kafka.apache.org <users@kafka.apache.org>
> Subject: Apache Kafka Connect
>
> Hi All,
>
> I have a scenario where I want to send data from Elasticsearch to MongoDB
> through Kafka, and while researching I came across Kafka Connect.
>
> Through Kafka Connect, is it possible to have Elasticsearch as a source
> connector that will send data/messages from Elasticsearch to Kafka? I came
> across the Kafka sink connector, which can receive messages/data from the
> Kafka server through topics.
>
> In the case of the MongoDB sink connector, what would be the behaviour if I
> have data larger than the maximum document size in MongoDB, which is 16 MB?
> Is there a way to handle this?
>
> Also, is there a prerequisite to have Docker set up before installing the
> connectors?
>
> Thanks & Regards,
> Namita
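P.S. If you go the Kafka Connect route on the MongoDB side, a standalone sink connector config could look roughly like this (a sketch based on the MongoDB Kafka sink connector; the topic, connection URI, database, and collection names below are placeholders):

# Sketch only: all names and URIs below are placeholders.
name=mongodb-sink
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
tasks.max=1
topics=es-export
connection.uri=mongodb://localhost:27017
database=mydb
collection=mycol
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false

On the 16 MB question: that limit is enforced by MongoDB itself, so an oversized record will simply fail at insert time; Connect's errors.tolerance and dead letter queue settings can route such failures to a separate topic instead of stopping the connector, but the record itself has to be split or trimmed upstream. And Docker is not a prerequisite: Kafka Connect ships with the Apache Kafka distribution, so Docker is just one convenient way to run it.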