Hi Superainbower,
could you share the complete logs with us? They show which Flink version
you are using and the classpath the JVM is started with. Have you also
checked whether the same problem occurs with the latest Flink version?
Cheers,
Till
On Mon, Oct 12, 2020 at 10:32 AM superainbower wrote:
Hi Till,
Could you tell me how to configure HDFS as the state backend when I deploy
Flink on k8s?
I tried adding the following to flink-conf.yaml:
state.backend: rocksdb
state.checkpoints.dir: hdfs://slave2:8020/flink/checkpoints
state.savepoints.dir: hdfs://slave2:8020/flink/savepoints
state.backend.incr
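Besides the state.* options above, I assume I also have to point Flink at
the Hadoop configuration so that the hdfs:// scheme can be resolved, e.g.
something like this in flink-conf.yaml (the path below is just a placeholder
on my side):

env.hadoop.conf.dir: /etc/hadoop/conf

Is anything more than that needed inside the image?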
Hi Saksham,
if you want to extend the Flink Docker image, you can find more details
here [1].
If you want to include the library in your user jar, then you have to add
the library as a dependency to your pom.xml file and enable the shade
plugin for building an uber jar [2].
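A minimal shade plugin section for your pom.xml could look roughly like the
following (the plugin version is only an example):

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.2.4</version>
      <executions>
        <execution>
          <!-- bind the shade goal to the package phase so that
               "mvn package" produces the uber jar -->
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

Running mvn package then produces a single jar that contains your code plus
the added library, and that jar is what you submit to the cluster.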
[1]
https://ci.apache.
Hi Saksham,
the easiest approach would probably be to include the required libraries in
your user code jar which you submit to the cluster. Using Maven's shade
plugin should help with this task. Alternatively, you could also create a
custom Flink Docker image where you add the required libraries to the image.
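For the Docker approach, a sketch of such an image could be as simple as
this (the base image tag and the jar name are placeholders):

FROM flink:1.11.2
# put the extra library on Flink's classpath
COPY my-library.jar /opt/flink/lib/

Jars placed under /opt/flink/lib are picked up by both the JobManager and
the TaskManager processes.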
Hi,
I have set up Flink on Kubernetes following this page:
https://ci.apache.org/projects/flink/flink-docs-release-1.11/ops/deployment/kubernetes.html
I am able to reach the Flink web UI, but I need to submit a job using:
http://localhost:8001/api/v1/namespaces/default/services/flink-jobmanager:w
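Concretely, what I was planning to try is to go through kubectl proxy to the
JobManager's REST API, roughly like this (the service port name "webui" and
the jar path are just my guesses):

curl -X POST -F "jarfile=@target/my-job.jar" http://localhost:8001/api/v1/namespaces/default/services/flink-jobmanager:webui/proxy/jars/upload

curl -X POST "http://localhost:8001/api/v1/namespaces/default/services/flink-jobmanager:webui/proxy/jars/<jar-id>/run?entry-class=com.example.MyJob"

where <jar-id> comes from the response of the upload call. Is this the right
way to submit a job?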