Hi Asmath
When you create a connector, the request is forwarded to the REST API of the 
worker that is the leader. The leader uses a Kafka producer to write to the 
topic that stores your connector configs. The other workers continuously 
monitor this config topic, so if it changes they know they need to perform a 
rebalance because a new connector has been added. When your workers start, 
they read the config from this topic and keep it in memory so they know the 
current configuration of the connectors. This topic and the other internal 
topics used by Kafka Connect should use compaction so that records aren't 
removed after the retention time; the most recent version of each connector's 
config remains.
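Recent Connect versions create these internal topics compacted automatically, but if you pre-create them yourself (topic name, partition count and replication factor below are just illustrative), make sure to set cleanup.policy=compact, e.g.:

```shell
# Pre-create a compacted config topic for Kafka Connect (names/sizes illustrative).
# cleanup.policy=compact keeps the latest record per key rather than deleting
# records once the retention time passes.
kafka-topics.sh --create \
  --bootstrap-server localhost:9092 \
  --topic connect-configs \
  --partitions 1 \
  --replication-factor 3 \
  --config cleanup.policy=compact
```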
For alerting specifically on connector status, you could use the REST API to 
get the status of each connector and its tasks at regular intervals and alert 
if any status is FAILED (or define a threshold for an acceptable number of 
failures).
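As a rough sketch of that polling approach (the worker URL is a placeholder for your own deployment; /connectors and /connectors/<name>/status are the standard Connect REST endpoints):

```python
import json
import urllib.request

CONNECT_URL = "http://localhost:8083"  # assumption: your Connect worker's REST endpoint

def fetch_json(path):
    """GET a path from the Connect REST API and parse the JSON response."""
    with urllib.request.urlopen(CONNECT_URL + path) as resp:
        return json.load(resp)

def failed_items(statuses):
    """Given a list of connector status payloads, return (name, detail) pairs for failures."""
    failures = []
    for status in statuses:
        name = status["name"]
        if status["connector"]["state"] == "FAILED":
            failures.append((name, "connector FAILED"))
        for task in status.get("tasks", []):
            if task["state"] == "FAILED":
                failures.append((name, "task %d FAILED" % task["id"]))
    return failures

def check_connectors():
    """Poll every connector's status; returns the failures to alert on."""
    names = fetch_json("/connectors")
    statuses = [fetch_json("/connectors/%s/status" % n) for n in names]
    return failed_items(statuses)
```

You would run check_connectors() on a schedule (cron, or a loop with a sleep) and feed any non-empty result into whatever email or paging mechanism you already have.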
Hope this helps, 
Jamie


-----Original Message-----
From: KhajaAsmath Mohammed <mdkhajaasm...@gmail.com>
To: users <users@kafka.apache.org>
Sent: Thu, Oct 24, 2019 03:52 PM
Subject: Monitor Kafka connect jobs


Hi,


We are using Kafka Connect in production and I have a few questions about it.
When we submit a Kafka Connect job using the REST API, the job runs
continuously in the background. Due to some issues, let's say we restarted the
Kafka cluster. Do we need to start all the jobs again manually?


Is there a way to monitor these jobs using tools? I know we can use the Connect
UI, but if we have more than 1000 jobs it would become more complex.


I am also looking for a trigger mechanism to send an email or alert the support
team if a connector was killed for some reason.


Thanks,

Asmath
