Thanks, Yun. Makes sense. How would you reference a jar file from inside
another jar for such an invocation?
In my case I would have an interactive application - a Spring Boot web app -
where the job would be configured and
StreamExecutionEnvironment.execute(jobName)
is called.
The Spring app is a runnable fat jar with my "flink" jar packaged along with
the other jars. How would I specify the location of that jar so that
StreamExecutionEnvironment
can find it?
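
For illustration, the only approach I can think of is copying the nested jar
out of the fat jar to a temporary file first, roughly like this (the
BOOT-INF/lib path and jar name are just my guess - I'm not even sure the
nested entry is visible through the classloader):

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Copy the nested "flink" jar out of the running fat jar so that
// createRemoteEnvironment gets a real file-system path it can ship.
// The resource path below is hypothetical.
InputStream nested =
    getClass().getResourceAsStream("/BOOT-INF/lib/my-flink-job.jar");
Path tempJar = Files.createTempFile("flink-job", ".jar");
Files.copy(nested, tempJar, StandardCopyOption.REPLACE_EXISTING);

StreamExecutionEnvironment env =
    StreamExecutionEnvironment.createRemoteEnvironment(
        hostName, hostPort, tempJar.toString());
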
Thanks,
Alex

On Wed, Oct 28, 2020 at 11:03 PM Yun Gao <yungao...@aliyun.com> wrote:

> Hi Alexander,
>
>  From my side I still think it should be reasonable to have a jar that
> contains the code that runs in the client and is also shipped to the
> cluster. This jar could then be included in the list of jars to ship.
>
>  For the second issue, similarly I think you may first build the project
> to get the jar containing the code, then fill in the path of the generated
> jar to test the submission.
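>
>  A minimal sketch of what I mean (host, port, and jar path below are
> placeholders only):
>
> StreamExecutionEnvironment env =
>     StreamExecutionEnvironment.createRemoteEnvironment(
>         "jobmanager-host", 8081,
>         "/path/to/project/target/my-client-code.jar");
> env.execute("my-job");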
>
> Best,
>  Yun
>
>
> ------------------Original Mail ------------------
> *Sender:*Alexander Bagerman <bager...@gmail.com>
> *Send Date:*Thu Oct 29 11:38:45 2020
> *Recipients:*Yun Gao <yungao...@aliyun.com>
> *CC:*Flink ML <user@flink.apache.org>
> *Subject:*Re: How to deploy dynamically generated flink jobs?
>
>> I did try it, but this option seems to be for third-party jars. In my
>> case I would need to specify/ship the jar that contains the code where the
>> job is being constructed. I'm not clear on
>> 1. how to point to the containing jar
>> 2. how to test such a submission from my project running in Eclipse
>> Alex
>>
>> On Wed, Oct 28, 2020 at 8:21 PM Yun Gao <yungao...@aliyun.com> wrote:
>>
>>> Hi Alexander,
>>>
>>> The signature of the createRemoteEnvironment is
>>>
>>> public static StreamExecutionEnvironment createRemoteEnvironment(
>>>       String host, int port, String... jarFiles);
>>>
>>> This method can also ship the jars to the remote cluster for execution.
>>> Could you try passing the jar files to the remote environment as well?
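>>>
>>> For example, a minimal sketch (host, port, and jar path are placeholders):
>>>
>>> StreamExecutionEnvironment env = StreamExecutionEnvironment.createRemoteEnvironment(
>>>       "jobmanager-host", 8081, "/path/to/my-job.jar");
>>> env.execute(jobName);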
>>>
>>>
>>> Best,
>>>
>>>  Yun
>>>
>>> ------------------------------------------------------------------
>>> Sender:Alexander Bagerman<bager...@gmail.com>
>>> Date:2020/10/29 10:43:16
>>> Recipient:<user@flink.apache.org>
>>> Subject:How to deploy dynamically generated flink jobs?
>>>
>>>
>>>
>>> Hi,
>>>
>>> I am trying to build functionality to dynamically configure a Flink
>>> job (Java) in my code based on some additional metadata and submit it to
>>> Flink running in a session cluster.
>>>
>>> Flink version is 1.11.2
>>>
>>> The problem I have is how to provide the packaged job to the cluster.
>>> When I try the following code:
>>>
>>> StreamExecutionEnvironment env =
>>>     StreamExecutionEnvironment.createRemoteEnvironment(hostName, hostPort);
>>> // ... configuring job workflow here ...
>>> env.execute(jobName);
>>>
>>> I get a ClassNotFoundException stating that the code for my mapping
>>> functions did not make it to the cluster, which makes sense.
>>>
>>> What would be the right way to deploy dynamically configured Flink jobs
>>> that are not packaged as a jar file but rather generated ad hoc?
>>>
>>> Thanks
>>>
>>>
