Re: spark context not picking up default hadoop filesystem

2015-01-26 Thread Akhil Das
Ah, I think when running locally you should give the full HDFS URL, like val logs = sc.textFile("hdfs://akhldz:9000/sigmoid/logs").

Thanks
Best Regards

On Mon, Jan 26, 2015 at 9:36 PM, Tamas Jambor wrote:
> thanks for the reply. I have tried to add SPARK_CLASSPATH, I got a warning
> that it was deprecated ...
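A minimal sketch of that suggestion, assuming a SparkContext named sc and a namenode reachable at akhldz:9000 (the host and path come only from the example above; substitute your own):

    // Scala: read with a fully qualified HDFS URI, so the default
    // filesystem setting does not matter for this particular read.
    val logs = sc.textFile("hdfs://akhldz:9000/sigmoid/logs")
    println(logs.count())  // force an action to verify the path resolves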

Re: spark context not picking up default hadoop filesystem

2015-01-26 Thread Tamas Jambor
Thanks for the reply. I have tried to add SPARK_CLASSPATH, but I got a warning that it was deprecated (and it didn't solve the problem). I also tried to run with --driver-class-path, which did not work either. I am trying this locally.

On Mon Jan 26 2015 at 15:04:03 Akhil Das wrote:
> You can also try adding the core-site.xml in the SPARK_CLASSPATH ...
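For context, a hedged sketch of an alternative that avoids the classpath route entirely: load the Hadoop site files into the context's Hadoop configuration directly. The /etc/hadoop/conf paths below are assumptions, not something stated in the thread:

    // Scala: point the existing SparkContext's Hadoop configuration at the
    // cluster's site files so fs.defaultFS is picked up from core-site.xml.
    import org.apache.hadoop.fs.Path

    sc.hadoopConfiguration.addResource(new Path("/etc/hadoop/conf/core-site.xml"))
    sc.hadoopConfiguration.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"))

    // A bare path should now resolve against HDFS rather than the local filesystem.
    val logs = sc.textFile("/sigmoid/logs")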

Re: spark context not picking up default hadoop filesystem

2015-01-26 Thread Akhil Das
You can also try adding the core-site.xml to the SPARK_CLASSPATH. By the way, are you running the application locally or in standalone mode?

Thanks
Best Regards

On Mon, Jan 26, 2015 at 7:37 PM, jamborta wrote:
> hi all,
>
> I am trying to create a spark context programmatically, using
> org.apache. ...
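Since the original question was about creating the context programmatically, a rough sketch of one way to do that while still getting the default filesystem right, assuming spark.hadoop.*-prefixed properties are forwarded into the Hadoop configuration Spark builds; the master URL and namenode address are placeholders, not values from the thread:

    // Scala: build the context in code and set fs.defaultFS via a
    // spark.hadoop.*-prefixed property instead of relying on core-site.xml
    // being on the classpath.
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("default-fs-example")
      .setMaster("local[*]")  // or a standalone master, e.g. spark://host:7077
      .set("spark.hadoop.fs.defaultFS", "hdfs://namenode:9000")

    val sc = new SparkContext(conf)
    val logs = sc.textFile("/sigmoid/logs")  // resolves against fs.defaultFS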