Seems like there are 3 possibilities:
1. Change the user Flink runs as to a user with HDFS rights
2. hdfs chown the directory you're writing to (or hdfs chmod to open up
access)
3. I've seen org.apache.hadoop.security.UserGroupInformation used to do
something like this:
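A minimal sketch of that approach, assuming simple (non-Kerberos)
authentication; the user name "hadoop", the namenode address, and the
checkpoint path are placeholders:

    import java.security.PrivilegedExceptionAction;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class HdfsAsRemoteUser {
        public static void main(String[] args) throws Exception {
            // Build a client-side identity for the HDFS user,
            // independent of the local OS user.
            UserGroupInformation ugi =
                UserGroupInformation.createRemoteUser("hadoop");
            ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
                Configuration conf = new Configuration();
                conf.set("fs.defaultFS", "hdfs://namenode:8020"); // placeholder
                // Everything inside doAs() talks to HDFS as "hadoop".
                FileSystem fs = FileSystem.get(conf);
                fs.mkdirs(new Path("/user/hadoop/flink-checkpoints"));
                return null;
            });
        }
    }

Note this only changes the client-side identity under simple
authentication; with Kerberos you would need real credentials instead.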
I have the same question.
I am setting fs.hdfs.hadoopconf to the location of a Hadoop config. However,
when I start a job, I get an error message that it's trying to connect to
the HDFS directory as user "flink":
Caused by:
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessCo
Do you also set fs.hdfs.hadoopconf in flink-conf.yaml
(https://ci.apache.org/projects/flink/flink-docs-master/setup/config.html#common-options)?
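For reference, a minimal sketch of that setting in flink-conf.yaml; the
path is a placeholder for whatever directory holds your core-site.xml and
hdfs-site.xml:

    fs.hdfs.hadoopconf: /path/to/hadoop/conf

Flink should then pick up fs.defaultFS and the security settings from the
files in that directory.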
On Thu, Aug 11, 2016 at 2:47 PM, Dong-iL, Kim wrote:
> Hi.
> In this case, I used a standalone cluster (AWS EC2) and I want to connect to
> a remote HDFS mach
Hi.
In this case, I used a standalone cluster (AWS EC2) and I want to connect to a
remote HDFS machine (AWS EMR).
I registered the location of core-site.xml as below.
Does it need other properties?
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://…:8020</value>
</property>
<property>
  <name>hadoop.security.authentication</name>
  <value>simple</value>
</property>
Hi!
Do you register the Hadoop config in the Flink configuration?
Also, do you use Flink standalone or on YARN?
Stephan
On Tue, Aug 9, 2016 at 11:00 AM, Dong-iL, Kim wrote:
> Hi.
> I’m trying to set an external HDFS as the state backend.
> My OS user name is ec2-user; the HDFS user is hadoop.
> There is
Hi.
I’m trying to set an external HDFS as the state backend.
My OS user name is ec2-user; the HDFS user is hadoop.
There is a permission denied exception.
I want to specify the HDFS user name.
I set hadoop.job.ugi in core-site.xml and HADOOP_USER_NAME on the command line,
but it does not work.
What shall I do?
Thanks.
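For reference, the HADOOP_USER_NAME attempt would look roughly like this
(a sketch; "hadoop" is the HDFS user from above, and the script path
assumes a standard standalone setup):

    export HADOOP_USER_NAME=hadoop
    ./bin/start-cluster.sh

HADOOP_USER_NAME is only honored under simple authentication, and it has
to be in the environment of the JVMs that actually talk to HDFS (the
JobManager and TaskManagers), which may be why setting it on the client
command line appeared to have no effect.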