Hi Stephen,

Have you tried the --jars option (with the jars separated by commas)? It
should make the given jars available to both the driver and the executors.
I believe one caveat currently is that if you give it a folder, it won't
pick up the jars inside.
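
For example, with placeholder paths standing in for your own jars:

  ./bin/spark-shell --jars /path/to/my-app.jar,/path/to/some-dependency.jar

Because a directory isn't expanded, one rough workaround (just a sketch, and
it assumes the paths contain no spaces) is to build the comma-separated list
in the shell first, e.g. for the HBase jars mentioned below:

  # turn the space-separated glob expansion into a comma-separated list
  HBASE_JARS=$(echo "$HBASE_HOME"/lib/*.jar | tr ' ' ',')
  ./bin/spark-shell --jars "$HBASE_JARS"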

-Sandy


On Fri, Aug 15, 2014 at 4:07 PM, Stephen Boesch <java...@gmail.com> wrote:

> Although this has been discussed a number of times here, I am still unclear
> on how to add user jars to the spark-shell:
>
> a) for importing classes for use directly within the shell interpreter
>
> b) for invoking SparkContext commands with closures referencing
> user-supplied classes contained within jars.
>
> Similarly to other posts, I have gone through:
>
>  - updating bin/spark-env.sh (SPARK_CLASSPATH, SPARK_SUBMIT_OPTS)
>  - creating conf/spark-defaults.conf and adding spark.executor.extraClassPath
>  - --driver-class-path
>  - etc.
>
> Hopefully there would be something along the lines of a single entry added
> to some classpath somewhere, like this:
>
>    SPARK_CLASSPATH/driver-class-path/spark.executor.extraClassPath (or
> whatever the correct option is..) =
> $HBASE_HOME/*:$HBASE_HOME/lib/*:$SPARK_CLASSPATH
>
> Any ideas here?
>
> thanks
>
