Ah, I think when running locally you should give the full HDFS URL, like:

val logs = sc.textFile("hdfs://akhldz:9000/sigmoid/logs")
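
To double-check which filesystem the context actually resolved, you can
print the default FS from the Hadoop config. A minimal sketch (fs.defaultFS
is the Hadoop 2.x key; older versions use fs.default.name):

// Prints your hdfs:// URI if core-site.xml was picked up,
// or file:/// if Spark fell back to the local filesystem.
println(sc.hadoopConfiguration.get("fs.defaultFS"))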



Thanks
Best Regards

On Mon, Jan 26, 2015 at 9:36 PM, Tamas Jambor <jambo...@gmail.com> wrote:

> thanks for the reply. I tried adding SPARK_CLASSPATH and got a warning
> that it is deprecated (it didn't solve the problem); I also tried running
> with --driver-class-path, which did not work either. I am trying this
> locally.
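>
> For reference, the programmatic equivalent of that flag would be the
> spark.driver.extraClassPath property, something like the sketch below
> (the conf path is just a placeholder, and I understand it may be too
> late to set once the driver JVM is already running):
>
> import org.apache.spark.SparkConf
>
> // "/etc/hadoop/conf" is a hypothetical directory holding core-site.xml
> val conf = new SparkConf().set("spark.driver.extraClassPath", "/etc/hadoop/conf")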
>
>
>
>
> On Mon Jan 26 2015 at 15:04:03 Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> You can also try adding the core-site.xml to the SPARK_CLASSPATH. By the
>> way, are you running the application locally or in standalone mode?
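>>
>> A quick way to verify that core-site.xml is actually visible on the
>> classpath (Hadoop's Configuration loads it from there) is something
>> like:
>>
>> // prints a URL if core-site.xml is on the classpath, null otherwise
>> println(getClass.getClassLoader.getResource("core-site.xml"))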
>>
>> Thanks
>> Best Regards
>>
>> On Mon, Jan 26, 2015 at 7:37 PM, jamborta <jambo...@gmail.com> wrote:
>>
>>> hi all,
>>>
>>> I am trying to create a Spark context programmatically, using
>>> org.apache.spark.deploy.SparkSubmit. It all looks OK, except that the
>>> Hadoop config that is created during the process is not picking up
>>> core-site.xml, so it defaults back to the local filesystem. I have set
>>> HADOOP_CONF_DIR in spark-env.sh, and core-site.xml is in the conf
>>> folder. The whole thing works if it is executed through spark-shell.
>>>
>>> Just wondering where Spark is picking up the Hadoop config path from?
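>>>
>>> I assume I could point the context at the file explicitly, along these
>>> lines (the path is just an example from my setup), but I would rather
>>> it picked up HADOOP_CONF_DIR automatically:
>>>
>>> import org.apache.hadoop.fs.Path
>>>
>>> // explicitly merge core-site.xml into the context's Hadoop config;
>>> // the path below is a placeholder for wherever the file lives
>>> sc.hadoopConfiguration.addResource(new Path("/etc/hadoop/conf/core-site.xml"))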
>>>
>>> many thanks,
>>>
