Hi Paul,
> In my deployment, the hive connector (and its orc dependencies), which
> contains `DefaultBucketFactoryImpl`, was packaged into the job uber jar.
> On the other hand, the flink runtime, which contains
> `HadoopPathBasedBulkFormatBuilder`, is located in the lib folder. Since the
> two jars are
Hi Jingsong,
Thanks for the pointer. I checked the dependencies and found that it’s
caused by the classloaders.
In my deployment, the hive connector (and its orc dependencies), which contains
`DefaultBucketFactoryImpl`, was packaged into the job uber jar. On the other
hand, the flink runtime, which contains `HadoopPathBasedBulkFormatBuilder`, is
located in the lib folder. Since the two jars are
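A quick way to confirm this kind of split is to print where a class was actually loaded from and by which classloader. This is only an illustrative sketch (the `WhereFrom` name and command-line usage are made up, not from the thread); pass the fully qualified class name from the stack trace when running it inside the job's environment.

```java
// WhereFrom.java -- diagnostic sketch: prints the code source (jar or
// classpath entry) and the classloader that loaded a given class. If the
// same Flink class resolves to the uber jar in one context and to a jar in
// flink/lib in another, the two sides live in different runtime packages
// and package-private access between them fails with IllegalAccessError.
public class WhereFrom {
    public static void main(String[] args) throws Exception {
        String name = args.length > 0 ? args[0] : "java.lang.String";
        Class<?> c = Class.forName(name);
        // getCodeSource() is null for JDK bootstrap classes.
        Object source = c.getProtectionDomain().getCodeSource();
        System.out.println(name + " loaded from " + source
                + " by " + c.getClassLoader());
    }
}
```

Run with the class name from the error, e.g. `java WhereFrom org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink`, once from the lib classpath and once from the uber jar, and compare the two outputs.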
Hi,
It looks really weird.
Is there any possibility of a class conflict?
How do you manage your dependencies? Did you download the bundled jar into lib? [1]
[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/#using-bundled-hive-jar
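One way to check for such a conflict is to scan the candidate jars for the same class entry. The sketch below is illustrative only (the `FindDupes` name is made up, and the entry path uses `StreamingFileSink`, a class from the same package as the one in the stack trace); point it at the job uber jar and the jars in flink/lib.

```java
import java.util.jar.JarFile;

// FindDupes.java -- sketch: reports every jar (given as program arguments)
// that contains a particular class entry. If the entry shows up both in the
// uber jar and in a flink/lib jar, the class will be loaded twice by
// different classloaders, which can trigger IllegalAccessError.
public class FindDupes {
    public static void main(String[] args) throws Exception {
        String entry = "org/apache/flink/streaming/api/functions/"
                + "sink/filesystem/StreamingFileSink.class";
        for (String path : args) {
            try (JarFile jar = new JarFile(path)) {
                if (jar.getEntry(entry) != null) {
                    System.out.println(entry + " found in " + path);
                }
            }
        }
    }
}
```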
Best,
Jingsong
On Mon, Jul 13, 2020 at 5:48
Hi,
I’m trying out Flink 1.11, writing data to Hive ORC tables, but I got stuck
on a weird exception. I wonder if anyone has met this before? The Java
version is 1.8.0_151.
java.lang.IllegalAccessError: tried to access class
org.apache.flink.streaming.api.functions.sink.filesystem.De