Santhosh,

All the nodes need access to the jar file, so the classpath entry has to be set on every node, not just the master. Rather than editing each copy separately, it is usually easier to edit spark-env.sh once and rsync the conf directory out to all the nodes.
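Roughly like this, for example -- the jar path and worker hostnames below are placeholders, so substitute your own:

    # On each node, append the jar to SPARK_CLASSPATH in
    # /etc/spark/conf/spark-env.sh (read when the daemons start):
    export SPARK_CLASSPATH="$SPARK_CLASSPATH:/path/to/your-lib.jar"

    # Then, from the node where you edited the conf, push the whole
    # directory to the other nodes so every copy stays identical:
    for host in worker1 worker2; do
        rsync -av /etc/spark/conf/ "$host:/etc/spark/conf/"
    done

    # Restart the Spark daemons afterwards so they pick up the change.

Cheers
<k/>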
On Tue, Jun 17, 2014 at 9:26 PM, santhoma <santhosh.tho...@yahoo.com> wrote:
> Hi,
>
> This is about Spark 0.9.
> I have a 3-node Spark cluster. I want to add a locally available jar file
> (present on all nodes) to the SPARK_CLASSPATH variable in
> /etc/spark/conf/spark-env.sh so that all nodes can access it.
>
> The question is:
> should I edit spark-env.sh on all nodes to add the jar?
> Or is it enough to add it only on the master node, from where I am
> submitting jobs?
>
> thanks
> Santhosh
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/question-about-setting-SPARK-CLASSPATH-IN-spark-env-sh-tp7809.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.