I am running an AWS EC2 cluster that I launched using the spark-ec2 script
that comes with Spark, and I used the "-v master" option to run the head
version.
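For reference, the launch command was roughly like this (the key pair, identity
file, slave count, and cluster name are just placeholders, not my exact values):

    ./spark-ec2 -k my-keypair -i ~/my-keypair.pem -s 4 -v master launch my-cluster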

If I then log into the master and make changes to spark/conf/spark-defaults.conf,
how do I make the changes take effect across the cluster?
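The kind of change I am making is just adding or editing a property line, for
example (the values here are only illustrative):

    spark.executor.memory   4g
    spark.serializer        org.apache.spark.serializer.KryoSerializer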

Is just restarting spark-shell enough? (It does not seem to be.)
Does "~/spark/sbin/stop-all.sh ; sleep 5; ~/spark/sbin/start-all.sh" do it?
Do I need to copy the new spark-defaults.conf to all the slaves?
Or is there some command to sync everything?
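For concreteness, here is roughly what I imagine the full sequence would have to
look like; the copy step is only my guess, looping over the slaves file that
spark-ec2 writes on the master (I do not know if there is a built-in sync
command that does this for me):

    # my guess: push the edited config to every slave listed in conf/slaves
    while read slave; do
      rsync -av ~/spark/conf/spark-defaults.conf "$slave":~/spark/conf/
    done < ~/spark/conf/slaves

    # then restart the whole cluster from the master
    ~/spark/sbin/stop-all.sh
    sleep 5
    ~/spark/sbin/start-all.sh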

thanks
Daniel
