1. It's up to you; you can add either the internal IP or the external IP. It
won't be a problem as long as they are on the same network.
2. If you only want to start a particular slave, you can run:
sbin/start-slave.sh
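For reference, a minimal sketch of starting a single worker by hand, assuming the
master is already running. The hostname below is a placeholder, and the exact
arguments of start-slave.sh vary between Spark versions (some releases also expect
a worker number before the master URL), so check the script header first:

    # run on the new slave, from the Spark installation directory
    sbin/start-slave.sh spark://<master-hostname>:7077

The spark:// URL is the one shown at the top of the master's web UI (port 8080 by
default).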
Thanks
Best Regards
On Thu, May 28, 2015 at 1:52 PM, Nizan Grauer wrote:
hi,
thanks for your answer!
I have a few more questions:
1) The file /root/spark/conf/slaves has the full DNS names of the servers (e.g.
ec2-52-26-7-137.us-west-2.compute.amazonaws.com). Did you add the internal IPs
there instead?
2) You call start-all. Isn't that too aggressive? Let's say I have 20
slaves up and I only want to add one more; do I really have to restart
everything?
I do it this way:
- Launch a new instance by clicking on an existing slave instance and choosing
*Launch more like this*
- Once it's launched, SSH into it and add the master's public key to its
.ssh/authorized_keys
- Add the slave's internal IP to the master's conf/slaves file
- Do sbin/start-all.sh, and the new slave will show up in the master's web UI
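For what it's worth, the steps above translate roughly into commands like the
following, run from the master node. This is a sketch only, assuming the spark-ec2
layout with Spark under /root/spark; the user, key path, and IP are placeholders:

    # 1) allow passwordless SSH from the master to the new slave
    cat ~/.ssh/id_rsa.pub | ssh root@<new-slave-internal-ip> 'cat >> ~/.ssh/authorized_keys'

    # 2) register the new slave with the master
    echo "<new-slave-internal-ip>" >> /root/spark/conf/slaves

    # 3) start the daemons; workers that are already running are normally
    #    left alone by the launcher scripts rather than restarted
    /root/spark/sbin/start-all.sh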