- Remove localhost from the conf/slaves file and add each slave's private IP,
one per line (see the first sketch after this list).
- Make sure the master and slave machines are in the same security group (this
way all ports will be accessible to all machines); a CLI sketch follows the
list.
- In the conf/spark-env.sh file, add export
SPARK_MASTER_IP=MASTER-NODES-PUBLIC-OR-PRIVATE-IP and remove any
SPARK_LOCAL_IP setting (see the last sketch below).
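
For example, with two hypothetical slaves at private IPs 10.0.0.11 and
10.0.0.12, the master's conf/slaves would contain exactly these two lines,
with localhost removed:

    10.0.0.11
    10.0.0.12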
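If you drive AWS from the CLI rather than the console, one way to open every
port inside the group is a self-referencing ingress rule (sg-12345678 stands
in for your group's real ID):

    aws ec2 authorize-security-group-ingress \
        --group-id sg-12345678 \
        --protocol tcp --port 0-65535 \
        --source-group sg-12345678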
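And a minimal conf/spark-env.sh, identical on every node, might look like
this, with 10.0.0.10 standing in for the master's actual address:

    #!/usr/bin/env bash
    # Pin the standalone master to one fixed address so the master, the
    # workers, and the web UI all agree on where the master lives.
    export SPARK_MASTER_IP=10.0.0.10
    # No SPARK_LOCAL_IP on purpose: let each node bind its own address.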

These changes should get you started with a Spark cluster. If not, look in
the log files for more detailed information.
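
The standalone daemons write their logs under $SPARK_HOME/logs, one file per
daemon with the user and hostname baked into the filename, so a wildcard is
the easiest way to reach them. On a slave that never appears in the master's
web UI, for example:

    tail -n 50 $SPARK_HOME/logs/*org.apache.spark.deploy.worker.Worker*.out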


bjameshunter wrote
> Hi,
> 
> I've tried half a dozen times to build a Spark cluster on EC2, without
> using the ec2 scripts or EMR. I'd like to eventually get an IPython
> notebook server running on the master, and the ec2 scripts and EMR don't
> seem to accommodate that.
> 
> I build an Ubuntu-spark-ipython machine. 
> I set up the IPython server. IPython and Spark work together.
> I make an image of the machine, spin up two of them. I can ssh between the
> original (master) and the two new slaves without a password (I put the
> master's id_rsa.pub on the slaves).
> I add slave public IPs to $SPARK_HOME/conf/slaves, underneath "localhost"
> I execute $SPARK_HOME/sbin/start-all.sh
> Slaves and master start. The master GUI shows only one slave - itself.
> ---
> Here's where, I think, the documentation ends for me and I start trying
> random stuff:
> setting SPARK_MASTER_IP to the EC2 public IP on all machines, setting
> SPARK_LOCAL_IP to 127.0.0.1,
> changing the hostname on the master to the public IP, changing it to my
> domain name prefixed with spark-master and routing it through my Digital
> Ocean account, etc.
> 
> All combinations of the above steps have been tried, and then some.
> 
> Any clue what I don't understand here?
> 
> Thanks,
> Ben
> 
> https://spark.apache.org/docs/latest/spark-standalone.html#cluster-launch-scripts




