Hi, Andrea
If you are running a Flink cluster on YARN, the jar
`flink-shaded-hadoop2-uber-1.6.4.jar` should exist in the lib dir of the
Flink client, so that it can be uploaded to the YARN distributed cache
and then be available on the JM and TM.
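A quick way to check this on the client machine might look like the following sketch (the `FLINK_HOME` default is an assumption; point it at your actual installation):

```python
# Sketch: verify the shaded Hadoop uber jar is present in the Flink client's
# lib directory. The /opt/flink fallback below is an assumption.
import glob
import os

flink_home = os.environ.get("FLINK_HOME", "/opt/flink")
pattern = os.path.join(flink_home, "lib", "flink-shaded-hadoop2-uber-*.jar")
matches = glob.glob(pattern)
print(matches if matches else "uber jar not found in lib/")
```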
And if you are running a standalone Flink cluster, the j
Hi Qiu,
my jar does not contain the class
`org.apache.hadoop.hdfs.protocol.HdfsConstants`, but I do expect it is
contained within `flink-shaded-hadoop2-uber-1.6.4.jar`, which is located in
the Flink cluster's lib dir.
On Thu, Jun 27, 2019 at 04:03 Congxian Qiu <
qcx978132...@gmail.com> wrote:
Hi Andrea
Regarding the NoClassDefFoundError, could you please verify that
`org.apache.hadoop.hdfs.protocol.HdfsConstants` exists in your jar?
Or could you use Arthas [1] to check whether the class is present while
the job is running?
[1] https://github.com/alibaba/arthas
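The jar check suggested above can be sketched as follows: a jar is just a zip archive, so we can look for the class's `.class` entry directly (the helper name and the jar path passed to it are hypothetical):

```python
# Sketch: check whether a fully-qualified class has a .class entry inside
# a jar. A jar file is a standard zip archive, so zipfile can read it.
import zipfile

def jar_contains_class(jar_path: str, class_name: str) -> bool:
    """Return True if the class has a corresponding .class entry in the jar."""
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()
```

For example, `jar_contains_class("your-job.jar", "org.apache.hadoop.hdfs.protocol.HdfsConstants")` would report whether the class made it into the fat jar (the jar path here is a placeholder).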
Best,
Congxian
Andrea Spina
Dear community,
I'm trying to use HDFS checkpoints in flink-1.6.4 with the following
configuration:
state.backend: rocksdb
state.checkpoints.dir: hdfs://rbl1.stage.certilogo.radicalbit.io:8020/flink/checkpoint
state.savepoints.dir: hdfs://rbl1.stage.certilogo.radicalbit.io:8020/flink/savepoints