This can happen when the EC2 instances are not ready yet and the SSH port is not
open. By the time you SSH in manually, the instances have finished booting,
which is why the manual connection succeeds.

Please allow more time by specifying -w 300 (the default is 120 seconds).
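For example, your original launch command with the longer wait would look
something like this (same arguments as before, with only -w 300 added):

./spark-ec2 -k keypair_name -i keypair.pem -s 5 -t c1.xlarge -r us-west-2 \
  --hadoop-major-version=2.4.0 -w 300 launch spark_cluster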

Thanks,
Tracy

> On Jul 29, 2014, at 8:17 AM, sparking <research...@gmail.com> wrote:
> 
> I'm trying to launch Spark with this command on AWS:
> ./spark-ec2 -k keypair_name -i keypair.pem -s 5 -t c1.xlarge -r us-west-2
> --hadoop-major-version=2.4.0 launch spark_cluster
> 
> This script is erroring out with this message:
> ssh: connect to host <hostname> port 22: Connection refused
> Error executing remote command, retrying after 30 seconds: Command '['ssh',
> '-o', 'StrictHostKeyChecking=no', '-i', 'keypair.pem', '-t', '-t',
> u'root@<hostname>', "\n      [ -f ~/.ssh/id_rsa ] ||\n        (ssh-keygen -q
> -t rsa -N '' -f ~/.ssh/id_rsa &&\n         cat ~/.ssh/id_rsa.pub >>
> ~/.ssh/authorized_keys)\n    "]' returned non-zero exit status 255
> 
> Strange thing is, I can manually ssh to the master node as "root" using this
> command:
> ssh root@<hostname> -i keypair.pem
> 
> Does anyone know what is going on here? Any help is appreciated.
> 
