Hi Kannan,

I believe you should be able to use the --jars option for this when invoking
spark-shell or performing a spark-submit.  Per the docs:

  --jars JARS                 Comma-separated list of local jars to include
                              on the driver and executor classpaths.
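
For example, to get the HBase client jars onto the executor classpath, something
like the following should work (the jar paths are just placeholders; point them
at the jars from your own HBase installation):

    spark-shell --master yarn \
      --jars /path/to/hbase-client.jar,/path/to/hbase-common.jar,/path/to/hbase-protocol.jar

The same --jars flag works with spark-submit, and since it is just a
command-line argument you can build the comma-separated list with a script
instead of hard-coding it.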

HTH.

-Todd

On Thu, Feb 26, 2015 at 5:43 PM, Kannan Rajah <kra...@maprtech.com> wrote:

> SparkConf.scala logs a warning saying SPARK_CLASSPATH is deprecated and we
> should use spark.executor.extraClassPath instead. But the online
> documentation states that spark.executor.extraClassPath is only meant for
> backward compatibility.
>
> https://spark.apache.org/docs/1.2.0/configuration.html#execution-behavior
>
> Which one is right? I have a use case to submit an HBase job from
> spark-shell and make it run using YARN. In this case, I need to somehow add
> the HBase jars to the classpath of the executor. If I add them to
> SPARK_CLASSPATH and export it, it works fine. Alternatively, if I set
> spark.executor.extraClassPath in spark-defaults.conf, it works fine.
> But the reason I don't like spark-defaults.conf is that I need to hard-code
> it instead of relying on a script to generate the classpath. I can use a
> script in spark-env.sh and set SPARK_CLASSPATH.
>
> Given that compute-classpath uses the SPARK_CLASSPATH variable, why is it
> marked as deprecated?
>
> --
> Kannan
>
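
For reference, the two alternatives you describe would look roughly like this
(a sketch only; the paths and the use of 'hbase classpath' to generate the
list are assumptions about your local setup):

    # spark-defaults.conf: hard-coded executor classpath
    spark.executor.extraClassPath  /path/to/hbase-client.jar:/path/to/hbase-common.jar

    # spark-env.sh: classpath generated by a script (assumes the hbase command is on PATH)
    export SPARK_CLASSPATH=$(hbase classpath)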
