You can also try adding core-site.xml to the SPARK_CLASSPATH, by the way.
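In spark-env.sh that would look something like this (assuming your Hadoop
config lives under /etc/hadoop/conf, adjust the path to your setup):

    # make core-site.xml visible to Spark by putting its directory on the classpath
    export SPARK_CLASSPATH=/etc/hadoop/conf:$SPARK_CLASSPATH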
Are you running the application locally, or in standalone mode?
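If you are building the context yourself, you can also point its Hadoop
configuration at core-site.xml explicitly. A rough sketch (the path and app
name are just examples):

    import org.apache.hadoop.fs.Path
    import org.apache.spark.{SparkConf, SparkContext}

    // master is assumed to be supplied elsewhere (e.g. by spark-submit)
    val sc = new SparkContext(new SparkConf().setAppName("my-app"))

    // load core-site.xml explicitly, so fs.defaultFS points at HDFS
    // rather than the local file system
    sc.hadoopConfiguration.addResource(new Path("/etc/hadoop/conf/core-site.xml"))

Equivalently, you could set fs.defaultFS directly on sc.hadoopConfiguration.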

Thanks
Best Regards

On Mon, Jan 26, 2015 at 7:37 PM, jamborta <jambo...@gmail.com> wrote:

> hi all,
>
> I am trying to create a Spark context programmatically, using
> org.apache.spark.deploy.SparkSubmit. It all looks OK, except that the
> Hadoop config that is created during the process is not picking up
> core-site.xml, so it defaults back to the local file system. I have set
> HADOOP_CONF_DIR in spark-env.sh and put core-site.xml in the conf folder.
> The whole thing works if it is executed through the spark shell.
>
> Just wondering, where does Spark pick up the Hadoop config path from?
>
> many thanks,
>
