Hi Max,

I see the same output in the TaskManager log:

11:25:04,274 INFO  org.apache.flink.runtime.taskmanager.TaskManager  -  Hadoop version: 2.6.0

I do get this line at the beginning of both job and task manager log files:

11:25:04,100 WARN  org.apache.hadoop.util.NativeCodeLoader  - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Do you think this warning has anything to do with the problem?
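
If it helps, I can also check which Hadoop jars the running task managers actually pick up on their classpath, with something along these lines (a rough sketch: it just splits the TaskManager JVM's command line on ':' and filters for Hadoop entries):

ps aux | grep org.apache.flink.runtime.taskmanager.TaskManager | tr ':' '\n' | grep -i hadoop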

Thanks,
Ali

On 2015-12-23, 7:30 AM, "Maximilian Michels" <m...@apache.org> wrote:

>Hi Ali,
>
>Could you please also post the Hadoop version output of the task
>manager log files? It looks like the task managers are running a
>different Hadoop version.
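>
>For example, something like this should pull the relevant lines out of
>the default log directory (adjust the path and file pattern as needed):
>
>    grep "Hadoop version" log/*taskmanager*.log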
>
>Thanks,
>Max
>
>On Tue, Dec 22, 2015 at 4:28 PM, Kashmar, Ali <ali.kash...@emc.com> wrote:
>> Hi Robert,
>>
>> I found the version in the job manager log file:
>>
>> 17:33:49,636 INFO  org.apache.flink.runtime.jobmanager.JobManager  -  Hadoop version: 2.6.0
>>
>> But the Hadoop installation I have says this:
>>
>> ubuntu@ubuntu-171:~/Documents/hadoop-2.6.0$ bin/hadoop version
>> Hadoop 2.6.0
>> Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
>> Compiled by jenkins on 2014-11-13T21:10Z
>> Compiled with protoc 2.5.0
>> From source with checksum 18e43357c8f927c0695f1e9522859d6a
>> This command was run using
>> /home/ubuntu/Documents/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar
>>
>>
>> So one of them is lying to me? :)
>>
>> Ali
>>
>> On 2015-12-22, 10:16 AM, "Robert Metzger" <rmetz...@apache.org> wrote:
>>
>>>Hi Ali,
>>>
>>>the TaskManagers and the JobManager log the Hadoop version on startup.
>>>
>>>On Tue, Dec 22, 2015 at 4:10 PM, Kashmar, Ali <ali.kash...@emc.com> wrote:
>>>
>>>> Hello,
>>>>
>>>> I'm trying to use HDFS as the store for Flink checkpoints, so I
>>>> downloaded the Hadoop 2.6.0/Scala 2.10 build of Flink and installed it.
>>>> I also downloaded Hadoop 2.6.0 separately from the Hadoop website and
>>>> set up HDFS on a separate machine. When I start Flink I get the
>>>> following error:
>>>>
>>>> 17:34:13,047 INFO  org.apache.flink.runtime.jobmanager.JobManager  - Status of job 9ba32a08bc0ec02810bf5d2710842f72 (Protocol Event Processing) changed to FAILED.
>>>> java.lang.Exception: Call to registerInputOutput() of invokable failed
>>>>         at org.apache.flink.runtime.taskmanager.Task.run(Task.java:529)
>>>>         at java.lang.Thread.run(Thread.java:745)
>>>> Caused by: java.io.IOException: The given file URI (hdfs://10.13.182.171:9000/user/flink/checkpoints) points to the HDFS NameNode at 10.13.182.171:9000, but the File System could not be initialized with that address: Server IPC version 9 cannot communicate with client version 4
>>>>         at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.initialize(HadoopFileSystem.java:337)
>>>>         at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:253)
>>>>         at org.apache.flink.runtime.state.filesystem.FsStateBackend.<init>(FsStateBackend.java:142)
>>>>         at org.apache.flink.runtime.state.filesystem.FsStateBackend.<init>(FsStateBackend.java:101)
>>>>         at org.apache.flink.runtime.state.filesystem.FsStateBackendFactory.createFromConfig(FsStateBackendFactory.java:48)
>>>>         at org.apache.flink.streaming.runtime.tasks.StreamTask.createStateBackend(StreamTask.java:517)
>>>>         at org.apache.flink.streaming.runtime.tasks.StreamTask.registerInputOutput(StreamTask.java:171)
>>>>         at org.apache.flink.runtime.taskmanager.Task.run(Task.java:526)
>>>>         ... 1 more
>>>> Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.RPC$VersionMismatch): Server IPC version 9 cannot communicate with client version 4
>>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1113)
>>>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
>>>>         at com.sun.proxy.$Proxy6.getProtocolVersion(Unknown Source)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>         at java.lang.reflect.Method.invoke(Method.java:497)
>>>>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
>>>>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
>>>>         at com.sun.proxy.$Proxy6.getProtocolVersion(Unknown Source)
>>>>         at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
>>>>         at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183)
>>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281)
>>>>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
>>>>         at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
>>>>         at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.initialize(HadoopFileSystem.java:321)
>>>>         ... 8 more
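>>>>
>>>> For reference, the checkpoint settings in conf/flink-conf.yaml are
>>>> roughly the following (I'm going from memory, so the key names may not
>>>> be exact):
>>>>
>>>> state.backend: filesystem
>>>> state.backend.fs.checkpointdir: hdfs://10.13.182.171:9000/user/flink/checkpoints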
>>>>
>>>> I searched for this error online and it indicates that the client (Flink
>>>> in this case) is at a much lower version. Is there a way to check the
>>>> version of Hadoop packaged with my Flink installation?
>>>>
>>>> Thanks,
>>>> Ali
>>>>
>>
