Hi,

We have a simple Stateful Functions job that consumes from Kafka, calls
an HTTP endpoint (on AWS, behind an Elastic Load Balancer), and
publishes the result back to Kafka.

* We created a jar file to be deployed on a standalone cluster (it is
not a Docker image), so we add `statefun-flink-distribution`
version 3.0.0 as a dependency in that jar file.
* The entry class in our job configuration is
`org.apache.flink.statefun.flink.core.StatefulFunctionsJob`, and we
simply keep a single module.yaml file in the resources folder for the
module configuration.
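
For reference, the module.yaml in our resources folder looks roughly
like the following sketch (endpoint URLs, topic names, and namespaces
here are placeholders, not our real values):

```yaml
version: "3.0"

module:
  meta:
    type: remote
  spec:
    endpoints:
      - endpoint:
          meta:
            kind: http
          spec:
            functions: example/*
            # ELB-fronted HTTP endpoint (placeholder URL)
            urlPathTemplate: https://my-elb.example.com/{function.name}
    ingresses:
      - ingress:
          meta:
            type: io.statefun.kafka.v1/ingress
            id: example/input
          spec:
            address: kafka-broker:9092
            consumerGroupId: my-group
            topics:
              - topic: input-topic
                valueType: example/Request
                targets:
                  - example/my-function
    egresses:
      - egress:
          meta:
            type: io.statefun.kafka.v1/egress
            id: example/output
          spec:
            address: kafka-broker:9092
            deliverySemantic:
              type: at-least-once
```

The environment-specific parts are essentially the endpoint URL and the
Kafka addresses, which is why we would like to swap this file per
environment.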

My question: we would like to deploy that jar to different
environments (dev and prod), but we are not sure how to pass different
module configurations (module.yaml, or module_nxt.yaml/module_prd.yaml)
to the job during startup without creating separate jar files for
each environment.

Thanks,
Deniz
