I did try it, but this option seems to be intended for third-party jars. In
my case I would need to specify/ship the jar that contains the code where
the job itself is being constructed. I'm not clear on (see the sketch below
for what I have in mind):
1. how to point to the containing jar
2. how to test such a submission from my project running in Eclipse
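
For concreteness, a minimal sketch of what I imagine, assuming the project
is first built into a jar (e.g. via mvn package); the class name, host,
port, and job name below are hypothetical placeholders, not working values:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class DynamicJobSubmitter {

    public static void main(String[] args) throws Exception {
        // Locate the jar that contains this class. This only works when
        // the class was actually loaded from a jar; when run straight from
        // Eclipse it returns the bin/ directory instead, so a pre-built
        // jar path would have to be passed explicitly in that case.
        String jobJar = DynamicJobSubmitter.class.getProtectionDomain()
                .getCodeSource().getLocation().toURI().getPath();

        // Ship that jar along with the remote environment; host and port
        // point at the session cluster's JobManager.
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createRemoteEnvironment(
                        "flink-host", 8081, jobJar);

        // ... configure the job workflow here ...

        env.execute("dynamically-configured-job");
    }
}

If that is roughly right, does testing from Eclipse then mean building the
jar first and pointing at it, rather than relying on the IDE classpath?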
Alex

On Wed, Oct 28, 2020 at 8:21 PM Yun Gao <yungao...@aliyun.com> wrote:

> Hi Alexander,
>
> The signature of createRemoteEnvironment is
>
> public static StreamExecutionEnvironment createRemoteEnvironment(
>       String host, int port, String... jarFiles);
>
> which can also ship the jars to be executed to the remote cluster. Could
> you try also passing the jar files to the remote environment?
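>
> For example, a hypothetical call (host, port, and jar path are
> placeholders):
>
> StreamExecutionEnvironment env =
>         StreamExecutionEnvironment.createRemoteEnvironment(
>                 "flink-host", 8081, "/path/to/job-classes.jar");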
>
>
> Best,
>
>  Yun
>
> ------------------------------------------------------------------
> Sender: Alexander Bagerman <bager...@gmail.com>
> Date: 2020/10/29 10:43:16
> Recipient: <user@flink.apache.org>
> Subject: How to deploy dynamically generated flink jobs?
>
>
>
> Hi,
>
> I am trying to build functionality to dynamically configure a Flink job
> (Java) in my code, based on some additional metadata, and submit it to a
> Flink session cluster.
>
> Flink version is 1.11.2
>
> The problem I have is how to provide the packaged job to the cluster. When
> I try the following code:
>
> StreamExecutionEnvironment env = 
> StreamExecutionEnvironment.createRemoteEnvironment(hostName, hostPort);
> ... configuring job workflow here...
> env.execute(jobName);
>
> I get a ClassNotFoundException stating that the code for my mapping
> functions did not make it to the cluster, which makes sense.
>
> What would be the right way to deploy dynamically configured Flink jobs
> that are not packaged as a jar file but rather generated ad hoc?
>
> Thanks
>
>
