Hi Himanshu,

The short answer is that you should configure your Stateful Functions directly in 
your job rather than relying on module.yaml. Here is an example: 
https://github.com/f1xmAn/era-locator/blob/34dc4f77539195876124fe604cf64c61ced4e5da/src/main/java/com/github/f1xman/era/StreamingJob.java#L68.

Check out this article on Flink DataStream and Stateful Functions 
interoperability 
https://medium.com/devoops-and-universe/realtime-detection-of-russian-crypto-phone-era-with-flink-datastream-and-stateful-functions-e77794fedc2a.
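
If it helps, below is a minimal sketch of what that in-job configuration can look like with the DataStream interop SDK (statefun-flink-datastream) described in the docs you linked. The function type, endpoint URI, and egress name are placeholders I made up, not taken from your setup:

import java.net.URI;
import java.time.Duration;

import org.apache.flink.statefun.flink.core.message.RoutableMessage;
import org.apache.flink.statefun.flink.core.message.RoutableMessageBuilder;
import org.apache.flink.statefun.flink.datastream.RequestReplyFunctionBuilder;
import org.apache.flink.statefun.flink.datastream.StatefulFunctionDataStreamBuilder;
import org.apache.flink.statefun.flink.datastream.StatefulFunctionEgressStreams;
import org.apache.flink.statefun.sdk.FunctionType;
import org.apache.flink.statefun.sdk.io.EgressIdentifier;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class StreamingJob {

  // Placeholder names: replace with your own namespace, type, and egress id.
  private static final FunctionType REMOTE_FN = new FunctionType("example", "my-remote-fn");
  private static final EgressIdentifier<String> EGRESS =
      new EgressIdentifier<>("example", "out", String.class);

  public static void main(String[] args) throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    // Any DataStream can feed functions once its elements are wrapped as RoutableMessages.
    DataStream<RoutableMessage> ingress =
        env.fromElements("a", "b", "c")
            .map(
                value ->
                    RoutableMessageBuilder.builder()
                        .withTargetAddress(REMOTE_FN, value) // the value doubles as the function id
                        .withMessageBody(value)
                        .build());

    // Bind the remote (request/reply) function here, in the job, instead of in module.yaml.
    StatefulFunctionEgressStreams out =
        StatefulFunctionDataStreamBuilder.builder("interop-example")
            .withDataStreamAsIngress(ingress)
            .withRequestReplyRemoteFunction(
                RequestReplyFunctionBuilder.requestReplyFunctionBuilder(
                        REMOTE_FN, URI.create("http://localhost:8000/statefun"))
                    .withMaxRequestDuration(Duration.ofSeconds(15))
                    .withMaxNumBatchRequests(500))
            .withEgressId(EGRESS)
            .build(env);

    // Messages your functions send to the egress come back as a regular DataStream.
    DataStream<String> results = out.getDataStreamForEgressId(EGRESS);
    results.print();

    env.execute("StateFun + DataStream interop");
  }
}

With this approach the remote function endpoint and its binding live in the job jar itself, so you submit only that jar to the cluster and run the remote function process alongside it; nothing has to be discovered from module.yaml at runtime.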

Best,
Tymur Yarosh
On 24 May 2022, 21:16 +0300, Himanshu Sareen <himanshusar...@outlook.com>, 
wrote:
> Team,
>
> I'm working on a POC where our existing Stateful Functions (remote) can 
> interact with the DataStream API.
> https://nightlies.apache.org/flink/flink-statefun-docs-release-3.2/docs/sdk/flink-datastream/
>
> I started the Flink cluster: ./bin/start-cluster.sh
> Then I submitted the .jar to Flink.
>
> However, on submitting, only the embedded function is called by the DataStream code.
>
> I'm unable to invoke stateful functions as module.yaml is not loaded.
>
> Can someone help me understand how we can deploy Stateful Functions code 
> (module.yaml) and DataStream API code in parallel on a Flink cluster?
>
>
> Regards
> Himanshu
>
