Is the host in /etc/hosts?
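
If the hosts file isn't the issue: since "hdpha" is an HDFS nameservice rather than a real hostname, the HDFS client inside the Spark driver also needs the HA settings from hdfs-site.xml on its classpath. A minimal sketch of setting them programmatically on the SparkContext's Hadoop configuration, in case HADOOP_CONF_DIR is not being picked up (the namenode hostnames and port below are placeholders for your cluster's actual values):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setAppName("LDA Sample").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // Mirror the HA entries from the cluster's hdfs-site.xml so that
    // "hdfs://hdpha" resolves even without HADOOP_CONF_DIR on the classpath.
    val hc = sc.hadoopConfiguration
    hc.set("dfs.nameservices", "hdpha")
    hc.set("dfs.ha.namenodes.hdpha", "nn1,nn2")
    // Placeholder hosts -- substitute the real namenode addresses:
    hc.set("dfs.namenode.rpc-address.hdpha.nn1", "namenode1.example.com:8020")
    hc.set("dfs.namenode.rpc-address.hdpha.nn2", "namenode2.example.com:8020")
    hc.set("dfs.client.failover.proxy.provider.hdpha",
      "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider")

    val distFile = sc.textFile("hdfs://hdpha/mini_newsgroups/")
    println(distFile.count())

The property names are the standard hdfs-site.xml keys for an HA nameservice; the values must match what the cluster itself uses.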

> On 13 Apr 2016, at 07:28, Amit Singh Hora <hora.a...@gmail.com> wrote:
> 
> I am trying to access a directory in Hadoop from my Spark code on my local
> machine. Hadoop is HA-enabled.
> 
> val conf = new SparkConf().setAppName("LDA Sample").setMaster("local[2]")
> val sc = new SparkContext(conf)
> val distFile = sc.textFile("hdfs://hdpha/mini_newsgroups/")
> println(distFile.count())
> but I am getting the error:
> 
> java.net.UnknownHostException: hdpha
> hdpha does not resolve to a particular machine; it is the nameservice name I
> chose for my HA Hadoop cluster. I have already copied all of the Hadoop
> configuration files to my local machine and set the HADOOP_CONF_DIR
> environment variable, but still no success.
> 
> Any suggestion would be of great help.
> 
> Note: Hadoop HA itself is working properly; I have tried uploading a file to
> Hadoop and it works.
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-Access-files-in-Hadoop-HA-enabled-from-using-Spark-tp26768.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
