This property already exists.

-----Original Message-----
From: "ashesh_28 [via Apache Spark User List]" 
<ml-node+s1001560n26769...@n3.nabble.com>
Sent: 4/13/2016 11:02 AM
To: "Amit Singh Hora" <hora.a...@gmail.com>
Subject: Re: Unable to Access files in Hadoop HA enabled from using Spark

Try adding the following property to hdfs-site.xml:

<property>
    <name>dfs.client.failover.proxy.provider.<Your Cluster Name></name>
    <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
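
If the property is already present, it may be that Spark is not picking up the
cluster's hdfs-site.xml at all (for example, HADOOP_CONF_DIR is not on the
driver/executor classpath). As a sanity check, the same HA client settings can
be set directly on Spark's Hadoop configuration. A minimal Scala sketch, where
"mycluster", nn1host and nn2host are placeholders for your own nameservice and
namenode hosts:

    import org.apache.spark.{SparkConf, SparkContext}

    // Push the HDFS HA client settings into Spark's Hadoop configuration so the
    // logical nameservice URI resolves even without the cluster's hdfs-site.xml.
    val sc = new SparkContext(new SparkConf().setAppName("hdfs-ha-check"))
    val hconf = sc.hadoopConfiguration

    hconf.set("fs.defaultFS", "hdfs://mycluster")
    hconf.set("dfs.nameservices", "mycluster")
    hconf.set("dfs.ha.namenodes.mycluster", "nn1,nn2")
    hconf.set("dfs.namenode.rpc-address.mycluster.nn1", "nn1host:8020")
    hconf.set("dfs.namenode.rpc-address.mycluster.nn2", "nn2host:8020")
    hconf.set("dfs.client.failover.proxy.provider.mycluster",
      "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider")

    // Read through the logical nameservice rather than a single namenode host.
    val lines = sc.textFile("hdfs://mycluster/path/to/input")
    println(lines.count())

If this works while plain hdfs://mycluster/... paths fail otherwise, the issue
is the configuration not reaching Spark rather than the property itself.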





