Sorry, a job graph is fixed once it is compiled and submitted to the
cluster; it is not dynamic in the way you want.

You could write wrapper operators that respond to your own RPCs to run
the appended operators you want, but then you have to maintain the
consistency semantics yourself.
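
To make the idea concrete, here is a rough sketch using Flink's broadcast
state pattern: a control topic carries the new sink details, and a
BroadcastProcessFunction picks them up and writes records to the new
target on a best-effort basis. The topic names, the bootstrap server, the
"endpoint" key and the writeToDynamicTarget helper are all placeholders
you would replace with your own logic; this is not dynamic job graph
modification and gives you no exactly-once guarantees for the side write.

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.streaming.api.datastream.BroadcastStream;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.co.BroadcastProcessFunction;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.util.Collector;

import java.util.Properties;

public class DynamicSinkSketch {

    // broadcast state holding the dynamically supplied sink endpoint (if any)
    static final MapStateDescriptor<String, String> SINK_CONFIG =
            new MapStateDescriptor<>("sink-config", String.class, String.class);

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder

        // main data stream: your existing Kafka topic
        DataStream<String> data =
                env.addSource(new FlinkKafkaConsumer<>("data-topic", new SimpleStringSchema(), props));

        // control stream: messages carrying the new Kafka/Elasticsearch details
        BroadcastStream<String> control =
                env.addSource(new FlinkKafkaConsumer<>("control-topic", new SimpleStringSchema(), props))
                   .broadcast(SINK_CONFIG);

        data.connect(control)
            .process(new BroadcastProcessFunction<String, String, String>() {

                @Override
                public void processBroadcastElement(String newEndpoint, Context ctx,
                                                    Collector<String> out) throws Exception {
                    // remember the dynamically added sink endpoint on every parallel instance
                    ctx.getBroadcastState(SINK_CONFIG).put("endpoint", newEndpoint);
                }

                @Override
                public void processElement(String record, ReadOnlyContext ctx,
                                           Collector<String> out) throws Exception {
                    out.collect(record); // keep feeding the original pipeline
                    String endpoint = ctx.getBroadcastState(SINK_CONFIG).get("endpoint");
                    if (endpoint != null) {
                        writeToDynamicTarget(endpoint, record); // best effort, no exactly-once
                    }
                }
            })
            .print(); // stand-in for your existing Elasticsearch sink

        env.execute("dynamic-sink-sketch");
    }

    // Placeholder: you would create/cache a client for `endpoint` and handle
    // retries, flushing and checkpoint alignment yourself.
    static void writeToDynamicTarget(String endpoint, String record) {
        // intentionally left as an exercise; Flink gives you no guarantees here
    }
}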

Best,
Danny Chan
On Jun 28, 2020 at 3:30 PM +0800, C DINESH <dinesh.kitt...@gmail.com> wrote:
> Hi All,
>
> In a Flink job I have a pipeline. It is consuming data from one Kafka topic
> and storing the data in an Elasticsearch cluster.
>
> Without restarting the job, can we add another Kafka cluster and another
> Elasticsearch sink to the job? That is, I would supply the new Kafka
> cluster and Elasticsearch details in the topic. After consuming that data,
> can our Flink job add the new source and sink to the same job?
>
>
> Thanks & Regards,
> Dinesh.