Bumping up this issue, as I have a similar problem now.
We are running Flink on YARN and trying to submit a job via the Java API using
YarnClusterClient (the run method with a PackagedProgram). The job starts to
execute (we can see it on the Dashboard) but fails with this error:
Caused by: java.lang.RuntimeException: Coul
Hi Flavio,
I think that the /opt folder only contains optional packages, which you can
move into /lib in order to have them loaded by your Flink cluster. What Fabian
was referring to is making it easier for users to find this package so that
they don't have to download it themselves.
Cheers,
Till
I faced this problem yesterday, and putting flink-hadoop-compatibility under
the flink/lib folder solved it for me.
But what is the official recommendation? Should I put it into the lib or the
opt folder?
Is there any difference from a class-loading point of view?
Best,
Flavio
On Fri, Apr 7, 2017 at
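On the lib-vs-opt question: only JARs under ./lib end up on the classpath when
the cluster starts, while ./opt is just a staging area for optional packages.
A minimal sketch of the move, simulated with a temporary directory so the
commands run as-is (FLINK_HOME and the JAR name are stand-ins for a real
installation):

```shell
# Only JARs under ./lib land on the Flink classpath at startup; ./opt is a
# staging area for optional packages. Simulated layout (FLINK_HOME and the
# JAR name stand in for a real installation):
FLINK_HOME=$(mktemp -d)
mkdir -p "$FLINK_HOME/opt" "$FLINK_HOME/lib"
touch "$FLINK_HOME/opt/flink-some-optional-package.jar"

# Moving (or copying) the package into ./lib is what makes the cluster load
# it on the next (re)start:
mv "$FLINK_HOME/opt/flink-some-optional-package.jar" "$FLINK_HOME/lib/"
ls "$FLINK_HOME/lib"
```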
Hey Fabi,
many thanks for your clarifications! It seems flink-shaded-hadoop2
itself is already included in the binary distribution:
> $ jar tf flink-1.2.0/lib/flink-dist_2.10-1.2.0.jar | grep org/apache/hadoop |
> head -n3
> org/apache/hadoop/
> org/apache/hadoop/fs/
> org/apache/hadoop/fs/FileS
Hi Petr,
I think that's expected behavior, because the exception is intercepted
and enriched with an instruction on how to solve the problem.
As you assumed, you need to add the flink-hadoop-compatibility JAR file to
the ./lib folder. Unfortunately, the file is not included in the binary
distribution.
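Since the JAR is not shipped in the binary distribution, the fix amounts to
fetching the flink-hadoop-compatibility artifact (e.g. from Maven Central) and
dropping it into ./lib before restarting the cluster. A sketch, with the
download stubbed out by a placeholder file and FLINK_HOME simulated by a
temporary directory so the commands run as-is:

```shell
# Sketch of the fix above: put flink-hadoop-compatibility into ./lib and
# restart the cluster. The real JAR would be downloaded (e.g. from Maven
# Central); a placeholder file and a temp FLINK_HOME stand in for both here.
JAR=flink-hadoop-compatibility_2.10-1.2.0.jar
FLINK_HOME=$(mktemp -d)
mkdir -p "$FLINK_HOME/lib"
DOWNLOADS=$(mktemp -d)
touch "$DOWNLOADS/$JAR"            # stand-in for the downloaded artifact

cp "$DOWNLOADS/$JAR" "$FLINK_HOME/lib/"
ls "$FLINK_HOME/lib"               # the JAR must be here before (re)starting
```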
Hello,
with 1.2.0 `WritableTypeInfo` was moved into its own artifact
(flink-hadoop-compatibility_2.10-1.2.0.jar). Unlike with 1.1.0, the
distribution jar `flink-dist_2.10-1.2.0.jar` no longer includes the hadoop
compatibility classes. However, `TypeExtractor`, which is part of
the distributio