Just tried this and it worked fine for me:

./bin/spark-shell --jars jar1,jar2,etc,etc
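If the shell starts cleanly, the classes in those jars should be importable
directly at the prompt. A quick way to check (the package and class names
below are just placeholders for whatever is actually in your jar):

scala> import com.example.MyClass
scala> val c = new MyClass()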

On Wed, Jun 11, 2014 at 10:25 AM, Ulanov, Alexander
<alexander.ula...@hp.com> wrote:
> Hi,
>
> I am currently using Spark 1.0 locally on Windows 7. I would like to use
> classes from an external jar in the spark-shell. I followed the instructions in:
> http://mail-archives.apache.org/mod_mbox/spark-user/201402.mbox/%3CCALrNVjWWF6k=c_jrhoe9w_qaacjld4+kbduhfv0pitr8h1f...@mail.gmail.com%3E
>
> I have set ADD_JARS="my.jar" SPARK_CLASSPATH="my.jar" in spark-shell.cmd, but
> this didn't work.
>
> I also tried running "spark-shell.cmd --jars my.jar --driver-class-path
> my.jar --driver-library-path my.jar", and that didn't work either.
>
> I cannot load any class from my.jar into the spark-shell. By the way, my.jar
> contains a simple Scala class.
>
> Best regards, Alexander



-- 
Marcelo
