SPARK_CLASSPATH is definitely deprecated, but my understanding is that
spark.executor.extraClassPath is not, so maybe the documentation needs
fixing.

I'll let someone who might know otherwise comment, though.
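
For what it's worth, setting the property directly does work on 1.2. A
minimal sketch of what I'd put in spark-defaults.conf (the HBase path below
is just a placeholder; point it at wherever your HBase jars actually live):

    # spark-defaults.conf -- placeholder path, adjust to your HBase layout
    spark.executor.extraClassPath  /opt/hbase/lib/*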

On Thu, Feb 26, 2015 at 2:43 PM, Kannan Rajah <kra...@maprtech.com> wrote:
> SparkConf.scala logs a warning saying SPARK_CLASSPATH is deprecated and we
> should use spark.executor.extraClassPath instead. But the online
> documentation states that spark.executor.extraClassPath is only meant for
> backward compatibility.
>
> https://spark.apache.org/docs/1.2.0/configuration.html#execution-behavior
>
> Which one is right? I have a use case to submit a hbase job from spark-shell
> and make it run using YARN. In this case, I need to somehow add the hbase
> jars to the classpath of the executor. If I add it to SPARK_CLASSPATH and
> export it, it works fine. Alternatively, if I set the
> spark.executor.extraClassPath in spark-defaults.conf, it works fine. But the
> reason I don't like spark-defaults.conf is that I need to hard-code it
> instead of relying on a script to generate the classpath. I can use a script
> in spark-env.sh and set SPARK_CLASSPATH.
>
> Given that compute-classpath uses the SPARK_CLASSPATH variable, why is it marked
> as deprecated?
>
> --
> Kannan
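
On the hard-coding point above: you can also compute the classpath when you
launch the shell instead of baking it into spark-defaults.conf. A rough
sketch, assuming "hbase classpath" on your gateway prints the jars you need
(treat that command and the exact flags as things to verify on your setup):

    # compute the HBase classpath at launch time and hand it to Spark;
    # assumes the same HBase install paths exist on the YARN nodes
    HBASE_CP=$(hbase classpath)
    spark-shell --master yarn-client \
      --driver-class-path "$HBASE_CP" \
      --conf spark.executor.extraClassPath="$HBASE_CP"

That keeps SPARK_CLASSPATH out of the picture entirely.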



-- 
Marcelo
