Hi,
I am trying to build functionality to dynamically configure a Flink job
(Java) in my code based on some additional metadata and submit it to a
Flink session cluster.
Flink version is 1.11.2
The problem I have is how to provide a packaged job to the cluster.
I did try that option, but it seems to be meant for a third-party jar. In my
case I would need to specify/ship the jar that contains the code where the
job is being constructed. I'm not clear on
1. how to point to the containing jar
2. how to test such a submission from my project running in Eclipse
Alex
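For reference, one way to sketch this against a session cluster is to create a remote environment and hand it the jar that holds the job-building code. Everything below is a placeholder sketch: the host, port, and jar path are assumptions, and the pipeline is a dummy; the key point is that `createRemoteEnvironment` accepts the jar(s) to ship with the job graph.

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class RemoteJobSubmitter {
    public static void main(String[] args) throws Exception {
        // Hypothetical host/port and jar path -- adjust to your session cluster.
        // The jar must contain the classes referenced while the job graph is
        // constructed, so point it at the jar built from this very project
        // (e.g. the output of `mvn package`).
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createRemoteEnvironment(
                        "flink-jobmanager-host", 8081,
                        "/path/to/my-job-with-dependencies.jar");

        // Dummy pipeline standing in for the dynamically configured job.
        env.fromElements("a", "b", "c")
           .map(String::toUpperCase)
           .print();

        env.execute("dynamically configured job");
    }
}
```

To test from Eclipse, one option is to first build the jar, then run this main class from the IDE with the jar path pointing at the freshly built artifact.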
On Wed,
> ...the code, then fill in the path of the generated jar
> to test the submission.
>
> Best,
> Yun
>
>
> ------Original Mail --
> *Sender:*Alexander Bagerman
> *Send Date:*Thu Oct 29 11:38:45 2020
> *Recipients:*Yun Gao
> *CC:*Flink ML
> *S
Hi,
I added my custom jar (that includes dependencies on Jackson) to the Flink
classpath. It seems to be loaded just fine. But when the job starts I am
getting the exception below. I am not sure how to interpret the exception,
though, and would appreciate advice on it.
Thanks
Alex
202
> Are you using ObjectMapper as a non-transient field? If so, please make it
> transient and initialize it in open() of a Rich*Function.
>
> On Fri, Nov 20, 2020 at 7:56 PM Alexander Bagerman
> wrote:
>
>> Hi,
>> I added my custom jar (that includes dependencies on Jackson) to Flink
>> classpath. It seems to be loaded just fine.
> ...in your IDE.
>
> Additionally, double-check that you have no lambdas/anonymous classes that
> reference outer classes with ObjectMapper. ObjectMapper should also be
> static as it's fully immutable, so you can also check that.
>
> On Fri, Nov 20, 2020 at 8:55 PM Alexander Bagerman
> wrote:
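The quoted advice (keep the ObjectMapper out of the serialized function state) can be illustrated with plain Java serialization. `HeavyHelper` below is a hypothetical stand-in for Jackson's ObjectMapper, and `MyFunction` mimics a Rich*Function: the transient field is not shipped, and `open()` re-creates it on the other side.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class TransientFieldDemo {

    // Hypothetical stand-in for a non-serializable helper such as ObjectMapper.
    static class HeavyHelper {
        String describe() { return "ready"; }
    }

    static class MyFunction implements Serializable {
        private static final long serialVersionUID = 1L;
        private transient HeavyHelper helper; // skipped by Java serialization

        void open() {              // analogous to RichFunction#open
            helper = new HeavyHelper();
        }

        String apply() {
            return helper.describe();
        }
    }

    // Serialize and deserialize, as Flink does when shipping a function.
    static <T extends Serializable> T roundTrip(T obj) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bos);
        out.writeObject(obj);
        out.flush();
        ObjectInputStream in =
                new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
        @SuppressWarnings("unchecked")
        T copy = (T) in.readObject();
        return copy;
    }

    public static void main(String[] args) throws Exception {
        MyFunction shipped = roundTrip(new MyFunction()); // helper is null here
        shipped.open();                                   // re-initialize after shipping
        System.out.println(shipped.apply());              // prints "ready"
    }
}
```

A non-transient ObjectMapper field would instead make serialization fail (it is not `Serializable`), which is the kind of exception discussed above.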
?
>
> Maybe a different deployment mode would be more appropriate? [1]
>
> Alternatively, if you want to go the hacky route, you could also try to
> shade your dependencies.
>
> [1] https://flink.apache.org/news/2020/07/14/application-mode.html
>
> On Fri, Nov 20, 2020
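For reference, submitting in Application Mode (suggested in [1] above) from the command line might look like the following sketch; the deployment target, memory setting, and jar path are placeholders:

```shell
# Sketch only: target, config values, and paths are placeholders.
# In Application Mode (Flink 1.11+) the main() that builds the job graph
# runs on the cluster, so the jar constructing the job ships with the
# deployment instead of being loaded by a separate client.
./bin/flink run-application \
    -t yarn-application \
    -Djobmanager.memory.process.size=2048m \
    ./target/my-job-with-dependencies.jar
```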