Finally I tried setting the configuration manually using
sc.hadoopConfiguration.set for:
    dfs.nameservices
    dfs.ha.namenodes.hdpha
    dfs.namenode.rpc-address.hdpha.n1

and it worked. I don't know why these settings were not being read from the files under
HADOOP_CONF_DIR.
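
For reference, a minimal sketch of the manual configuration (the nameservice hdpha and
namenode id n1 are from this thread; the second id n2, the port 8020, and the mapping of
the hostnames hdp231/ambarimaster onto n1/n2 are my assumptions):

    // Hedged sketch: n2, port 8020, and the host-to-id mapping are assumptions.
    sc.hadoopConfiguration.set("dfs.nameservices", "hdpha")
    sc.hadoopConfiguration.set("dfs.ha.namenodes.hdpha", "n1,n2")
    sc.hadoopConfiguration.set("dfs.namenode.rpc-address.hdpha.n1", "hdp231:8020")
    sc.hadoopConfiguration.set("dfs.namenode.rpc-address.hdpha.n2", "ambarimaster:8020")
    // The HDFS client also needs a failover proxy provider to find the active namenode:
    sc.hadoopConfiguration.set("dfs.client.failover.proxy.provider.hdpha",
      "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider")

With these set, hdfs://hdpha/... paths should resolve without consulting HADOOP_CONF_DIR.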

-----Original Message-----
From: "Amit Hora" <hora.a...@gmail.com>
Sent: 4/13/2016 11:41 AM
To: "Jörn Franke" <jornfra...@gmail.com>
Cc: "user@spark.apache.org" <user@spark.apache.org>
Subject: RE: Unable to Access files in Hadoop HA enabled from using Spark

There are DNS entries for both of my namenodes.
Ambarimaster is the standby and resolves to its IP perfectly.
Hdp231 is active and also resolves to its IP.
Hdpha is my Hadoop HA cluster (nameservice) name,
and hdfs-site.xml has the entries for this configuration.
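
For reference, the corresponding hdfs-site.xml entries would look roughly like this
(a sketch: the namenode id n2 and port 8020 are assumptions, as above):

    <property>
      <name>dfs.nameservices</name>
      <value>hdpha</value>
    </property>
    <property>
      <name>dfs.ha.namenodes.hdpha</name>
      <value>n1,n2</value>
    </property>
    <property>
      <name>dfs.namenode.rpc-address.hdpha.n1</name>
      <value>hdp231:8020</value>
    </property>
    <property>
      <name>dfs.namenode.rpc-address.hdpha.n2</name>
      <value>ambarimaster:8020</value>
    </property>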


From: Jörn Franke
Sent: 4/13/2016 11:37 AM
To: Amit Singh Hora
Cc: user@spark.apache.org
Subject: Re: Unable to Access files in Hadoop HA enabled from using Spark


Is the host in /etc/hosts ?

> On 13 Apr 2016, at 07:28, Amit Singh Hora <hora.a...@gmail.com> wrote:
> 
> I am trying to access a directory in Hadoop from my Spark code on my local
> machine. Hadoop is HA enabled.
> 
> import org.apache.spark.{SparkConf, SparkContext}
> 
> val conf = new SparkConf().setAppName("LDA Sample").setMaster("local[2]")
> val sc = new SparkContext(conf)
> val distFile = sc.textFile("hdfs://hdpha/mini_newsgroups/")
> println(distFile.count())
> but I am getting the following error:
> 
> java.net.UnknownHostException: hdpha
> hdpha does not resolve to a particular machine, as it is the nameservice name I have
> chosen for my HA Hadoop cluster. I have already copied all of the Hadoop configuration
> to my local machine and have set the env. variable HADOOP_CONF_DIR, but still no success.
> 
> Any suggestion will be of great help.
> 
> Note: Hadoop HA itself is working properly, as I have tried uploading a file to
> Hadoop directly and it works.
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-Access-files-in-Hadoop-HA-enabled-from-using-Spark-tp26768.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
