By the way, any idea how to sync the Spark conf dir with the other nodes in
the cluster?
~santhosh
Thanks. I hope this problem will go away once I upgrade to Spark 1.0, where
we can set cluster-wide classpaths using the spark-submit command.
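Something along these lines should work with spark-submit in 1.0 (the class
name, master URL, and paths below are just placeholders):

  spark-submit --class com.example.MyApp \
    --master spark://master:7077 \
    --jars /opt/libs/dependency.jar \
    /opt/apps/myapp.jar

The --jars option takes a comma-separated list of jars to put on the driver
and executor classpaths, so the jar no longer has to be pre-installed on
every node.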
Santhosh,
All the nodes need access to the jar file, so the classpath entry should be
set on all of them. It is usually better to rsync the conf directory to all
nodes than to edit each copy separately.
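For example, something like this run from the master node should do it (the
hostnames are placeholders for your workers):

  # push the conf directory to each worker node
  for host in node2 node3; do
    rsync -av /etc/spark/conf/ $host:/etc/spark/conf/
  done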
Cheers
On Tue, Jun 17, 2014 at 9:26 PM, santhoma wrote:
> Hi,
>
> This is about Spark 0.9.
Hi,
This is about Spark 0.9.
I have a 3-node Spark cluster. I want to add a locally available jar file
(present on all nodes) to the SPARK_CLASSPATH variable in
/etc/spark/conf/spark-env.sh so that all nodes can access it.
The question is: should I edit 'spark-env.sh' on all nodes to add the jar?
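For reference, the entry I am planning to add looks roughly like this (the
jar path is just an example):

  # /etc/spark/conf/spark-env.sh
  export SPARK_CLASSPATH=$SPARK_CLASSPATH:/opt/libs/myextra.jar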