Hi:

I am using Spark 1.1 and want to add an external jar to spark-shell. I
dug around and found that others are doing it in two ways.

*Method 1*

bin/spark-shell --jars "<path-to-jars>"  --master ...

*Method 2*

ADD_JARS=<path-to-jars> SPARK_CLASSPATH=<path-to-jars> bin/spark-shell --master ...

What is the difference between these two methods? In my case, method 1
does not work, while method 2 does.

Thanks.

Chuang
