Try using --jars instead of the driver-only options; the driver-only options 
should work with spark-shell too, but they may be less well tested.
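
For example, a minimal sketch (the JAR names here are just placeholders for 
your actual dependencies):

    ./bin/spark-shell --jars /mypath/first.jar,/mypath/second.jar

As far as I know, --jars makes the listed JARs visible both to the executors 
and to the shell itself, which is what you want for case (1).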

Unfortunately, you do have to specify each JAR separately. You could use a 
shell script to list a directory and build the comma-separated list (a rough 
sketch is below), or set up a project that builds all of the dependencies 
into one assembly JAR.
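
Here is a rough Bash sketch of the directory-listing approach (it assumes all 
the JARs sit directly in /mypath and that none of the file names contain 
spaces or commas):

    # Join every JAR in the directory into a single comma-separated list
    JARS=$(ls /mypath/*.jar | tr '\n' ',')
    # Drop the trailing comma and hand the list to --jars
    ./bin/spark-shell --jars ${JARS%,}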

Matei

> On Oct 30, 2014, at 5:24 PM, Shay Seng <s...@urbanengines.com> wrote:
> 
> Hi,
> 
> I've been trying to move up from Spark 0.9.2 to 1.1.0.
> I'm getting a little confused with the setup for a few different use cases, 
> so I'd be grateful for any pointers...
> 
> (1) spark-shell with jars that are only required by the driver
> (1a) 
> I added "spark.driver.extraClassPath  /mypath/to.jar" to my 
> spark-defaults.conf
> I launched spark-shell with:  ./spark-shell
> 
> Here I see on the WebUI that spark.driver.extraClassPath has been set, but I 
> am NOT able to access any methods in the jar.
> 
> (1b)
> I removed "spark.driver.extraClassPath" from my spark-defaults.conf
> I launched spark-shell with:  ./spark-shell --driver-class-path /mypath/to.jar
> 
> Again I see on the WebUI that spark.driver.extraClassPath has been set, 
> but this time I am able to access the methods in the jar. 
> 
> Q: Is spark-shell not considered the driver in this case? Why does using 
> --driver-class-path on the command line behave differently from setting it 
> in spark-defaults.conf?
>  
> 
> (2) Rather than adding each jar individually, is there a way to use 
> wildcards? Previously with SPARK_CLASSPATH I was able to use <mypath>/*, but 
> with --driver-class-path it seems to require individual files.
> 
> tks
> Shay

