Are you running from Eclipse?
If so, add the *HADOOP_CONF_DIR* path to the classpath.
Then you can access your HDFS directory as below:
import org.apache.spark.{SparkConf, SparkContext}

object sparkExample {
  def main(args: Array[String]) {
    val logname = "///user/hduser/input/sample.txt"
    // the app name here is arbitrary
    val conf = new SparkConf().setAppName("SimpleApp").setMaster("local[2]")
    val sc = new SparkContext(conf)
    println(sc.textFile(logname).count())
  }
}
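If the conf directory is not on the classpath, another option is to set the HA client properties on the SparkContext's Hadoop configuration before reading the file. This is only a sketch: the nameservice "mycluster", the namenode IDs "nn1"/"nn2" and the hostnames below are placeholders for whatever your hdfs-site.xml actually defines.

import org.apache.spark.{SparkConf, SparkContext}

object sparkHaExample {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("SimpleApp").setMaster("local[2]")
    val sc = new SparkContext(conf)
    val hc = sc.hadoopConfiguration
    // Placeholder values: copy the real nameservice, namenode IDs and
    // rpc-addresses from the cluster's hdfs-site.xml
    hc.set("fs.defaultFS", "hdfs://mycluster")
    hc.set("dfs.nameservices", "mycluster")
    hc.set("dfs.ha.namenodes.mycluster", "nn1,nn2")
    hc.set("dfs.namenode.rpc-address.mycluster.nn1", "namenode1.example.com:8020")
    hc.set("dfs.namenode.rpc-address.mycluster.nn2", "namenode2.example.com:8020")
    hc.set("dfs.client.failover.proxy.provider.mycluster",
      "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider")
    println(sc.textFile("hdfs://mycluster/user/hduser/input/sample.txt").count())
  }
}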
-Original Message-
From: "Amit Hora"
Sent: 4/13/2016 11:41 AM
To: "Jörn Franke"
Cc: "user@spark.apache.org"
Subject: RE: Unable to Access files in Hadoop HA enabled from using Spark
There are DNS entries for both of my namenodes.
Ambarimaster is the standby and it resolves to its IP per
-Original Message-
From: "Jörn Franke"
Sent: 4/13/2016 11:37 AM
To: "Amit Singh Hora"
Cc: "user@spark.apache.org"
Subject: Re: Unable to Access files in Hadoop HA enabled from using Spark
Is the host in /etc/hosts ?
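For example, entries along these lines on the client machine (the addresses and hostnames here are placeholders only):

192.168.1.10   namenode1.example.com   namenode1
192.168.1.11   namenode2.example.com   namenode2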
> On 13 Apr 2016, at 07:28, Amit Singh Hora wrote:
>
> I am trying to access a directory in Hadoop from my Spark code on my local
> machine. Hadoop is HA enabled.
>
> val conf = new SparkConf().setAppName("LDA Sample").setMaster("local[2]")
> val sc = new SparkContext(conf)
This property already exists.
-Original Message-
From: "ashesh_28 [via Apache Spark User List]"
Sent: 4/13/2016 11:02 AM
To: "Amit Singh Hora"
Subject: Re: Unable to Access files in Hadoop HA enabled from using Spark
Try adding the following property into hdfs-site.xml (the property name is suffixed with your HA nameservice ID):

<property>
  <name>dfs.client.failover.proxy.provider.<your-nameservice></name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
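For HA the client also needs the nameservice itself defined in hdfs-site.xml. A minimal sketch, where the nameservice "mycluster", the namenode IDs and the hostnames are placeholders, not values from your cluster:

<!-- Placeholder values: substitute your own nameservice, namenode IDs and hosts -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>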