Hello Christian,

I'm happy to hear that you are trying out StateFun and like the toolset!

Currently StateFun supports only Kafka/Kinesis egresses "out of the box",
simply because so far folks haven't requested anything else. I can create a
JIRA issue for that and we'll see how the community responds.

Meanwhile, exposing existing Flink connectors as sinks is also possible
using the approach from the link you provided.
You can see, for example, how our e2e test does it [1].

The way it works is:
1. You indeed need to create a Java application that depends on the
specific Flink connector that you are using.

2. The application needs to contain a StatefulFunctionModule that binds
this egress (see the sketch after this list).

3. Then you create a JAR and you can start StateFun using the official
Docker image, apache/flink-statefun, by mounting your module into the
modules/ path, for example:
/opt/statefun/modules/my_module/
Alternatively, you can create your own Docker image that derives from the
StateFun image but only adds that JAR to the modules directory. [2]
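
For illustration, here is a minimal sketch of such a module, assuming a
made-up class name and egress identifier; the PrintSinkFunction below just
stands in for the Elasticsearch/Cassandra sink that your application would
pull in from the corresponding Flink connector dependency:

import java.util.Map;
import org.apache.flink.statefun.flink.io.datastream.SinkFunctionSpec;
import org.apache.flink.statefun.sdk.io.EgressIdentifier;
import org.apache.flink.statefun.sdk.spi.StatefulFunctionModule;
import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;

// Minimal sketch: exposes a plain Flink SinkFunction as a StateFun egress.
// Swap PrintSinkFunction for the connector sink you actually use.
public final class MyEgressModule implements StatefulFunctionModule {

  // Hypothetical egress identifier; functions address the egress by
  // namespace ("com.example") and name ("my-egress").
  static final EgressIdentifier<String> MY_EGRESS =
      new EgressIdentifier<>("com.example", "my-egress", String.class);

  @Override
  public void configure(Map<String, String> globalConfiguration, Binder binder) {
    // Wrap the Flink sink in a SinkFunctionSpec and bind it as an egress.
    binder.bindEgress(new SinkFunctionSpec<>(MY_EGRESS, new PrintSinkFunction<>()));
  }
}

Keep in mind that the module class also has to be registered for Java's
ServiceLoader (e.g. via a META-INF/services entry for StatefulFunctionModule
or the @AutoService annotation), otherwise StateFun will not discover it
from the JAR in the modules directory.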

I hope that it helps,
Igal

[1]
https://github.com/apache/flink-statefun/blob/master/statefun-e2e-tests/statefun-smoke-e2e-driver/src/main/java/org/apache/flink/statefun/e2e/smoke/driver/DriverModule.java#L40

[2]
https://github.com/apache/flink-statefun/blob/master/statefun-e2e-tests/statefun-smoke-e2e-embedded/src/test/resources/Dockerfile#L20

On Tue, 28 Sep 2021 at 07:40, Christian Krudewig (Corporate Development)
<christian.krude...@dpdhl.com> wrote:

> Hello Roman,
>
> Well, if that's the way to do it, I can manage to maintain a fork of the
> statefun repo with these tiny changes. But first my question is whether
> that is the way it should be done, or whether there is another way to
> activate these connectors.
>
> Best,
>
> Christian
>
> -----Original Message-----
> From: Roman Khachatryan <ro...@apache.org>
> Sent: Tuesday, 28 September 2021 00:31
> To: Christian Krudewig (Corporate Development)
> <christian.krude...@dpdhl.com>; Igal Shilman <i...@ververica.com>
> Cc: user@flink.apache.org
> Subject: Re: How to add a Flink connector to stateful functions
>
> Hi,
>
> > Does that mean that I need to build the stateful functions java
> > application and afterwards the docker image?
> Yes, you have to rebuild the application after updating the pom, as well
> as its docker image.
>
> Is your concern related to synchronizing local docker images with the
> official repo?
> If so, wouldn't using a specific statefun image version solve this issue?
>
> Regards,
> Roman
>
> On Mon, Sep 27, 2021 at 9:29 PM Christian Krudewig (Corporate
> Development) <christian.krude...@dpdhl.com> wrote:
> >
> > Hello everyone,
> >
> >
> >
> > Currently I'm busy setting up a pipeline with Stateful Functions using a
> > deployment of the standard Docker image "apache/flink-statefun" to
> > Kubernetes. It has been going smoothly so far and I love the whole toolset.
> > But now I want to add egress modules for both Opensearch (= Elasticsearch
> > protocol) and ScyllaDB (= Cassandra protocol). The documentation at
> > https://ci.apache.org/projects/flink/flink-statefun-docs-master/docs/io-module/flink-connectors/
> > indicates that I can somehow simply plug in the standard Flink datastream
> > connectors for Elasticsearch and Cassandra. But I didn't quite get how exactly.
> >
> > It says "include the dependency in your pom". Does that mean that I need
> > to build the Stateful Functions Java application and afterwards the Docker
> > image? That would be a bit unfortunate in terms of long-term maintenance
> > effort, because I would need to keep my local checkout in sync with the
> > official repositories and rebuild every now and then. Maybe this can also
> > be added on top of the existing Docker image by adding some JAR file to
> > some magic plugin folder?
> >
> >
> >
> > Sorry, I hope this doesn't sound captious. I just want to understand and
> > do it the right way. Maybe there is also some minimal example? I didn't
> > find any in the playground nor on Stack Overflow or the mailing lists.
> >
> >
> >
> > Thanks,
> >
> >
> >
> > Christian Krudewig
