I am going to say no, but I have not actually tested this; I am just going on this line in the docs:

http://spark.apache.org/docs/latest/configuration.html

spark.driver.extraClassPath (none): Extra classpath entries to prepend to the classpath of the driver. Note: In client mode, this config must not be set through the SparkConf directly in your application, because the driver JVM has already started at that point. Instead, please set this through the --driver-class-path command line option or in your default properties file.
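For what it's worth, that means for a yarn-client spark-shell the driver-side entries have to be in place before the shell starts. A rough, untested sketch (the jar path is just a placeholder):

    # at launch time (placeholder path /local/path/extra.jar)
    spark-shell --master yarn-client \
      --driver-class-path /local/path/extra.jar \
      --jars /local/path/extra.jar

    # or, equivalently, in conf/spark-defaults.conf
    spark.driver.extraClassPath  /local/path/extra.jar

The --jars option is what ships the jar to the executors (the YARN containers). Calling sc.addJar("file:///local/path/extra.jar") on an already-running shell should register the jar for tasks launched after that call, but I have not verified that this is enough to get classes from it onto the workers' classpath in yarn-client, so I would test that before relying on it.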

On 12/17/2015 07:53 AM, amarouni wrote:
Hello guys,

Do you know if the method SparkContext.addJar("file:///...") can be used
on a running context (an already started spark-shell)?
And if so, does it add the jar to the classpath of the Spark workers
(YARN containers in the case of yarn-client)?

Thanks,

