RE: Unable to Access files in Hadoop HA enabled from using Spark

2016-04-13 Thread ashesh_28
Are you running from Eclipse? If so, add the *HADOOP_CONF_DIR* path to the classpath, and then you can access your HDFS directory as below: object sparkExample { def main(args: Array[String]){ val logname = "///user/hduser/input/sample.txt" val conf = new SparkConf().setAppName("SimpleA
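The snippet above is cut off by the archive. A minimal sketch of the suggested approach, assuming HADOOP_CONF_DIR (containing core-site.xml and hdfs-site.xml) is on the classpath; the logical nameservice name "mycluster" and the sample path are hypothetical placeholders, not values from the thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkHdfsHaExample {
  // Builds a fully qualified HDFS URI against a logical HA nameservice.
  // "mycluster" is a placeholder; use the value of dfs.nameservices
  // from your own hdfs-site.xml.
  def haUri(nameservice: String, path: String): String =
    s"hdfs://$nameservice$path"

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SimpleApp").setMaster("local[2]")
    val sc = new SparkContext(conf)
    // With HADOOP_CONF_DIR on the classpath, the HDFS client resolves the
    // logical nameservice to whichever NameNode is currently active, so the
    // code never hard-codes a NameNode host.
    val lines = sc.textFile(haUri("mycluster", "/user/hduser/input/sample.txt"))
    println(lines.count())
    sc.stop()
  }
}
```

Referring to the nameservice rather than a NameNode hostname is the point of the fix: a hard-coded hostname breaks whenever that node is in standby.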

RE: Unable to Access files in Hadoop HA enabled from using Spark

2016-04-12 Thread Amit Hora
-Original Message- From: "Amit Hora" Sent: 4/13/2016 11:41 AM To: "Jörn Franke" Cc: "user@spark.apache.org" Subject: RE: Unable to Access files in Hadoop HA enabled from using Spark There are DNS entries for both of my NameNodes; Ambarimaster is standby and it resolves to IP per

RE: Unable to Access files in Hadoop HA enabled from using Spark

2016-04-12 Thread Amit Hora
From: "Jörn Franke" Sent: 4/13/2016 11:37 AM To: "Amit Singh Hora" Cc: "user@spark.apache.org" Subject: Re: Unable to Access files in Hadoop HA enabled from using Spark Is the host in /etc/hosts? > On 13 Apr 2016, at 07:28, Amit Singh Hora wrote: > > I am trying to access di

Re: Unable to Access files in Hadoop HA enabled from using Spark

2016-04-12 Thread Jörn Franke
Is the host in /etc/hosts? > On 13 Apr 2016, at 07:28, Amit Singh Hora wrote: > > I am trying to access a directory in Hadoop from my Spark code on a local > machine. Hadoop is HA enabled. > > val conf = new SparkConf().setAppName("LDA Sample").setMaster("local[2]") > val sc = new SparkContext(conf)
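Jörn's question is about local name resolution: if the client machine cannot resolve the NameNode hostnames listed in hdfs-site.xml, HA failover cannot work. A minimal sketch of the /etc/hosts entries involved; the hostnames and IP addresses are hypothetical placeholders:

```
# /etc/hosts on the machine running the Spark driver (placeholder values)
192.168.1.10   nn1.example.com   nn1
192.168.1.11   nn2.example.com   nn2
```

Each hostname used in dfs.namenode.rpc-address.* must resolve here (or via DNS) from the client, not just from inside the cluster.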

RE: Unable to Access files in Hadoop HA enabled from using Spark

2016-04-12 Thread Amit Singh Hora
This property already exists. -Original Message- From: "ashesh_28 [via Apache Spark User List]" Sent: 4/13/2016 11:02 AM To: "Amit Singh Hora" Subject: Re: Unable to Access files in Hadoop HA enabled from using Spark Try adding the following propert

Re: Unable to Access files in Hadoop HA enabled from using Spark

2016-04-12 Thread ashesh_28
Try adding the following property into hdfs-site.xml: dfs.client.failover.proxy.provider. (suffixed with your nameservice name) set to org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-Access-files-in-Had
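The suggestion above was flattened by the archive; it is an hdfs-site.xml entry, and on the client side it normally sits alongside the other HA properties. A minimal sketch of the client-facing configuration, assuming a hypothetical logical nameservice "mycluster" with placeholder NameNode hostnames:

```xml
<!-- hdfs-site.xml (client side); "mycluster", nn1/nn2 hostnames are placeholders -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>nn1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>nn2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.mycluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

With these in place, clients address HDFS as hdfs://mycluster/... and the failover proxy provider picks the active NameNode.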