Re: Setting up spark to run on two nodes

2016-03-21 Thread Luciano Resende
There is also sbin/start-all.sh and sbin/stop-all.sh, which let you start/stop the master and workers all together. On Sunday, March 20, 2016, Akhil Das wrote: > You can simply execute the sbin/start-slaves.sh file to start up all slave > processes. Just make sure you have Spark installed on the
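
A rough sketch of what that looks like in practice, assuming Spark's standalone scripts of that era, a hypothetical install path /opt/spark on every node, and passwordless SSH from the master to the workers (hostnames node1/node2 are placeholders):

    # on the master node (hypothetical install path)
    cd /opt/spark

    # list the worker hostnames, one per line (placeholders)
    echo "node1" >  conf/slaves
    echo "node2" >> conf/slaves

    # start the master plus a worker on every host in conf/slaves
    sbin/start-all.sh

    # ...and later bring the whole cluster back down
    sbin/stop-all.sh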

Re: Setting up spark to run on two nodes

2016-03-20 Thread Akhil Das
You can simply execute the sbin/start-slaves.sh file to start up all slave processes. Just make sure you have Spark installed on the same path on all the machines. Thanks Best Regards On Sat, Mar 19, 2016 at 4:01 AM, Ashok Kumar wrote: > Experts. > > Please your valued advice. > > I have spark
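
As an illustration of that flow, assuming the same Spark directory (a hypothetical /opt/spark) on both machines and passwordless SSH from the master node (master-node, node1, node2 are placeholder hostnames):

    # conf/spark-env.sh on the master: tell the scripts where the master runs
    export SPARK_MASTER_HOST=master-node

    # conf/slaves: one worker hostname per line
    #   node1
    #   node2

    # start the master first, then launch a worker on every host in conf/slaves via SSH
    sbin/start-master.sh
    sbin/start-slaves.sh

    # workers register with spark://master-node:7077;
    # the web UI at http://master-node:8080 should show both workers as ALIVE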