Hm, it should just take effect immediately. But yes, there is a script for syncing everything:
/root/spark-ec2/copy-dir --delete /root/spark

After that you should do:

/root/spark/sbin/stop-all.sh
/root/spark/sbin/start-all.sh

2014-05-18 16:56 GMT-07:00 Daniel Mahler <dmah...@gmail.com>:
>
> I am running an AWS EC2 cluster that I launched using the spark-ec2
> script that comes with Spark, and I use the "-v master" option to run
> the head version.
>
> If I then log into the master and make changes to
> spark/conf/spark-defaults.conf, how do I make the changes take effect
> across the cluster?
>
> Is just restarting spark-shell enough? (It does not seem to be.)
> Does "~/spark/sbin/stop-all.sh ; sleep 5; ~/spark/sbin/start-all.sh"
> do it?
> Do I need to copy the new spark-defaults.conf to all the slaves?
> Or is there some command to sync everything?
>
> Thanks,
> Daniel
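
For reference, the full sequence on the master would look something like
the following. This is only a sketch: spark.executor.memory is an
illustrative property (use whatever setting you are changing), and the
paths assume the default spark-ec2 layout under /root:

    # edit the config on the master, e.g. append an illustrative property
    echo "spark.executor.memory 4g" >> /root/spark/conf/spark-defaults.conf

    # rsync /root/spark (conf/ included) to every slave
    /root/spark-ec2/copy-dir --delete /root/spark

    # restart the standalone daemons so they pick up the change
    /root/spark/sbin/stop-all.sh
    /root/spark/sbin/start-all.sh

    # then verify from a fresh spark-shell, e.g.:
    #   scala> sc.getConf.get("spark.executor.memory")

Note that copy-dir syncs the whole /root/spark directory, so it covers
conf/ along with everything else; the restart is what makes the running
daemons re-read spark-defaults.conf.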