1. Kafka Connect standalone workers have their connectors configured from
the properties file(s) passed on the command line at startup (see the
example start command below). You cannot use REST to add or remove them.
2. Correct: Standalone workers are isolated instances that cannot share
load with other workers.
3. Correct: Distributed workers in a cluster will distribute connector
tasks across the available workers and rebalance on the loss of a worker.
4. Correct: in Distributed mode you have to use the REST interface (see
https://rmoff.dev/kafka-connect-rest-api) to configure connectors; you
cannot use a properties file for that (see the example curl call below).
N.B. you still have a properties file to configure the worker itself.
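
For reference, starting a Standalone worker with Apache Kafka's scripts
looks something like this (the file names here are just placeholders; one
or more connector properties files follow the worker properties):

    bin/connect-standalone.sh config/connect-standalone.properties \
        config/my-source-connector.properties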

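In Distributed mode, by contrast, you start the worker with only the worker
properties and then create (and delete) connectors over REST. A minimal
sketch, assuming the worker's REST port is the default 8083 and using the
bundled FileStreamSource connector purely as a placeholder:

    bin/connect-distributed.sh config/connect-distributed.properties

    # create (or update) a connector
    curl -X PUT -H "Content-Type: application/json" \
         http://localhost:8083/connectors/my-file-source/config \
         -d '{
               "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
               "tasks.max": "1",
               "file": "/tmp/input.txt",
               "topic": "my-topic"
             }'

    # delete it again
    curl -X DELETE http://localhost:8083/connectors/my-file-source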

-- 

Robin Moffatt | Senior Developer Advocate | ro...@confluent.io | @rmoff


On Tue, 23 Mar 2021 at 02:02, Himanshu Shukla <himanshushukla...@gmail.com>
wrote:

> Hi All,
>
> I have the below understanding regarding Kafka Connect.
>
> 1. Kafka Connect Standalone provides the option of either running the job
> from the command line or using the REST interface to add/update/delete a
> connector job.
>
> 2. Standalone mode won't run as a clustered environment, i.e. it won't
> share the load if more than one instance is running. Each instance will be
> responsible for its own configured connector job, and load sharing won't
> happen.
>
> 3. Distributed mode will share the load among the instances and will
> provide fault tolerance, dynamic scaling, etc.
>
> 4. Distributed mode only allows connector job configuration through the
> REST interface. There is no other option, such as reading the connector job
> config from a properties file or from JDBC, etc.
>
> Please confirm if the above understanding is correct.
>
> --
> Regards,
> Himanshu Shukla
>
