Hello,

I’m trying to use HDFS as a store for Flink checkpoints, so I downloaded the 
Hadoop 2.6.0/Scala 2.10 build of Flink and installed it. I also downloaded 
Hadoop 2.6.0 separately from the Hadoop website and set up HDFS on a separate 
machine. When I start Flink I get the following error:

17:34:13,047 INFO  org.apache.flink.runtime.jobmanager.JobManager - Status of job 9ba32a08bc0ec02810bf5d2710842f72 (Protocol Event Processing) changed to FAILED.
java.lang.Exception: Call to registerInputOutput() of invokable failed
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:529)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: The given file URI (hdfs://10.13.182.171:9000/user/flink/checkpoints) points to the HDFS NameNode at 10.13.182.171:9000, but the File System could not be initialized with that address: Server IPC version 9 cannot communicate with client version 4
        at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.initialize(HadoopFileSystem.java:337)
        at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:253)
        at org.apache.flink.runtime.state.filesystem.FsStateBackend.<init>(FsStateBackend.java:142)
        at org.apache.flink.runtime.state.filesystem.FsStateBackend.<init>(FsStateBackend.java:101)
        at org.apache.flink.runtime.state.filesystem.FsStateBackendFactory.createFromConfig(FsStateBackendFactory.java:48)
        at org.apache.flink.streaming.runtime.tasks.StreamTask.createStateBackend(StreamTask.java:517)
        at org.apache.flink.streaming.runtime.tasks.StreamTask.registerInputOutput(StreamTask.java:171)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:526)
        ... 1 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.RPC$VersionMismatch): Server IPC version 9 cannot communicate with client version 4
        at org.apache.hadoop.ipc.Client.call(Client.java:1113)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
        at com.sun.proxy.$Proxy6.getProtocolVersion(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
        at com.sun.proxy.$Proxy6.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
        at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
        at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.initialize(HadoopFileSystem.java:321)
        ... 8 more
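
For context, since the trace goes through FsStateBackendFactory.createFromConfig, the backend is configured in my flink-conf.yaml. The relevant lines look roughly like this (the key names are what I believe this Flink version uses; the checkpoint URI matches the one in the error above):

```
state.backend: filesystem
state.backend.fs.checkpointdir: hdfs://10.13.182.171:9000/user/flink/checkpoints
```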

From searching online, this error apparently means the client (Flink in this 
case) is using a much older Hadoop client library than the server expects. Is 
there a way to check which Hadoop version is packaged with my Flink 
installation?
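
The only thing I could think of trying was poking at the jars under the Flink lib/ directory, along these lines (the path is just my install location, and I'm not sure this is the right way to check):

```shell
# Guess: look for Hadoop classes/version hints inside the flink-dist jar
cd /path/to/flink
ls lib/
unzip -l lib/flink-dist*.jar | grep -i hadoop | head
```

But I didn't see an obvious version number that way.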

Thanks,
Ali
