Hi,
We have been looking at using stateful functions to deploy a remote
python model as a stateful function and interacting with it from Flink
via Kafka.
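
For concreteness, here is a rough sketch of the kind of remote function this
involves, using the Python SDK's request-reply handler behind aiohttp. The
typenames, topic, port and scoring logic below are placeholders for
illustration, not our actual model:

from aiohttp import web
from statefun import (StatefulFunctions, RequestReplyHandler, Context, Message,
                      ValueSpec, IntType, kafka_egress_message)

functions = StatefulFunctions()

# "example/model", the "seen" counter, topic names and the scoring call are
# placeholders, not a known-good production setup.
@functions.bind(typename="example/model",
                specs=[ValueSpec(name="seen", type=IntType)])
async def model(context: Context, message: Message):
    # Per-key state is declared here but owned by the StateFun runtime,
    # which ships it along with every invocation.
    seen = (context.storage.seen or 0) + 1
    context.storage.seen = seen

    features = message.as_string()   # assumes a string-typed value from the Kafka ingress
    score = float(len(features))     # stand-in for the real model inference

    # Emit the prediction through a Kafka egress back towards Flink.
    context.send_egress(kafka_egress_message(
        typename="example/predictions",
        topic="predictions",
        key=context.address.id,
        value=f"{score} (request #{seen})"))

handler = RequestReplyHandler(functions)

async def handle(request):
    response_bytes = await handler.handle_async(await request.read())
    return web.Response(body=response_bytes, content_type="application/octet-stream")

app = web.Application()
app.add_routes([web.post("/statefun", handle)])

if __name__ == "__main__":
    web.run_app(app, port=8000)
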
Everything has worked well until we ran into some in-house deployment
issues around the various environments.
This, coupled with the use case (where we find
> it there.
>
> Generally, flink-conf.yaml should be part of your Flink runtime. For example,
> a file at /opt/flink/conf/flink-conf.yaml.
>
> Thanks,
> Igal.
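
As an illustration of what might live in that file when the StateFun runtime
runs on an existing Flink distribution: the classloader entry below is the
one the StateFun docs call out, the remaining values are ordinary Flink
settings with placeholder values to adapt:

# /opt/flink/conf/flink-conf.yaml (illustrative sketch)
classloader.parent-first-patterns.additional: org.apache.flink.statefun;org.apache.kafka;com.google.protobuf
state.backend: rocksdb
state.checkpoints.dir: file:///checkpoint-dir   # adjust to your environment
execution.checkpointing.interval: 30s
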
>
>
>
> On Thu, Sep 23, 2021 at 11:22 AM Barry Higgins
> wrote:
>
> > Hi Igal,
> > I
> l figure out
> the DataStream integration approach.
>
> All the best,
> Igal.
>
> [1]
> https://mvnrepository.com/artifact/org.apache.flink/statefun-flink-distribution/3.1.0
>
>
>
> On Thu, Sep 9, 2021 at 5:22 PM Barry Higgins
> wrote:
>
> > Hi,
> >
> > I'm looking at using the DataStream API from a Flink application against a
> > remote python stateful function deployed on another machine. I would like to
> > investigate how feasible it is to have all of the state management being
> > handled from the calling side, meaning that we don't need another in
> [1]
> https://nightlies.apache.org/flink/flink-statefun-docs-release-3.1/docs/sdk/flink-datastream/
>
>
> On Thu, Sep 2, 2021 at 1:07 PM Barry Higgins
> wrote:
>
> > Hi,
> >
> > I have set up a remote stateful function in python which I’ve deployed
> > on an A
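
For reference, the piece that wires a remote function like that to Kafka and
to the Flink-side StateFun runtime is the module definition. A rough StateFun
3.1-style sketch, with placeholder ids, topics, broker address and endpoint
URL that line up with the Python sketch near the top of this thread, might
look like:

# module.yaml (illustrative; ids, topics, addresses and URL are placeholders)
kind: io.statefun.endpoints.v2/http
spec:
  functions: example/*
  urlPathTemplate: http://model-host:8000/statefun
---
kind: io.statefun.kafka.v1/ingress
spec:
  id: example/requests
  address: kafka-broker:9092
  consumerGroupId: statefun-consumer
  topics:
    - topic: requests
      valueType: io.statefun.types/string
      targets:
        - example/model
---
kind: io.statefun.kafka.v1/egress
spec:
  id: example/predictions
  address: kafka-broker:9092
  deliverySemantic:
    type: at-least-once

In a setup along those lines the Flink job only produces to and consumes from
the two Kafka topics, while the function's state is handled by the StateFun
runtime.
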