hdfs://ServerURI:8020/user/cloudera/inputs should do the trick
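
For reference, here is a minimal sketch of how that full HDFS URI could be passed to a Flink DataSet program (the class name, job name and output path below are only illustrative assumptions, not taken from your job):

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class HdfsPathExample {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Read the input directly from HDFS using the full URI:
        // scheme + NameNode host + port + absolute path.
        DataSet<String> input =
            env.readTextFile("hdfs://ServerURI:8020/user/cloudera/inputs");

        // Write the result back to HDFS in the same way.
        input.writeAsText("hdfs://ServerURI:8020/user/cloudera/outputs");

        env.execute("HDFS path example");
    }
}

The same full-URI form also works for the shell commands you ran, e.g. hdfs dfs -chown -R yarn:hadoop hdfs://ServerURI:8020/user/cloudera/inputs. The earlier chown most likely failed because, under sudo, the relative path "inputs" was resolved against the home directory of the user running the command rather than /user/cloudera.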

On Mon, Jun 8, 2015 at 12:41 PM Pa Rö <paul.roewer1...@googlemail.com>
wrote:

> It works now; I have set the permissions for the yarn user,
> but my Flink app cannot find the path. I tried the following path and got
> the same exception:
> file:///127.0.0.1:8020/user/cloudera/inputs/
>
> How do I have to set the path to HDFS?
>
>
> 2015-06-08 11:38 GMT+02:00 Till Rohrmann <till.rohrm...@gmail.com>:
>
>> I assume that the paths inputs and outputs are not correct since you get
>> the error message *chown `outputs': No such file or directory*. Try to
>> provide the full path to the chown command, such as
>> hdfs://ServerURI/path/to/your/directory.
>>
>> On Mon, Jun 8, 2015 at 11:23 AM Pa Rö <paul.roewer1...@googlemail.com>
>> wrote:
>>
>>> Hi Robert,
>>>
>>> I saw that you wrote to me on Stack Overflow, thanks. Now the path is
>>> right and I get the old exception:
>>> org.apache.flink.runtime.JobException: Creating the input splits caused
>>> an error: File file:/127.0.0.1:8020/home/user/cloudera/outputs/seed-1
>>> does not exist or the user running Flink ('yarn') has insufficient
>>> permissions to access it.
>>>
>>> I have looked at HDFS and want to give the user yarn all permissions:
>>> [cloudera@quickstart bin]$ hdfs dfs -ls
>>> Found 9 items
>>> drwxrwxrwt   - cloudera cloudera          0 2015-06-03 04:24 .Trash
>>> drwxrwxrwt   - cloudera cloudera          0 2015-06-08 01:17 .flink
>>> drwxrwxrwt   - cloudera cloudera          0 2015-06-04 06:51 .staging
>>> drwxrwxrwt   - cloudera cloudera          0 2015-02-17 08:33 gdelt
>>> drwxrwxrwt   - cloudera cloudera          0 2015-06-02 06:42 inputs
>>> -rwxrwxrwt   1 cloudera cloudera   31223141 2015-06-03 03:53 ma-mahout.jar
>>> -rwxrwxrwt   1 cloudera cloudera   30037418 2015-06-03 03:53 ma-mapreduce.jar
>>> drwxrwxrwt   - cloudera cloudera          0 2015-06-04 07:38 oozie-oozi
>>> drwxrwxrwt   - cloudera cloudera          0 2015-06-03 03:59 outputs
>>> [cloudera@quickstart bin]$ sudo hdfs dfs -chown -R yarn:hadoop inputs
>>> chown: `inputs': No such file or directory
>>> [cloudera@quickstart bin]$ sudo hdfs dfs -chown -R yarn:hadoop outputs
>>> chown: `outputs': No such file or directory
>>>
>>> A helpful reference:
>>> https://hadoop.apache.org/docs/r2.4.1/hadoop-project-dist/hadoop-common/FileSystemShell.html
>>>
>>> I am doing something wrong; maybe you have an idea?
>>>
>>>
>
