Hi all,

I'm trying to launch a cluster with the spark-ec2 script but am seeing the
error below. Are the packages on S3 corrupted / not in the correct format?

Initializing spark

--2016-04-13 00:25:39--  http://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop1.tgz
Resolving s3.amazonaws.com (s3.amazonaws.com)... 54.231.11.67
Connecting to s3.amazonaws.com (s3.amazonaws.com)|54.231.11.67|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 277258240 (264M) [application/x-compressed]
Saving to: ‘spark-1.6.1-bin-hadoop1.tgz’

100%[==================================================================================================================>] 277,258,240 37.6MB/s   in 9.2s

2016-04-13 00:25:49 (28.8 MB/s) - ‘spark-1.6.1-bin-hadoop1.tgz’ saved [277258240/277258240]

Unpacking Spark

gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error is not recoverable: exiting now
mv: missing destination file operand after `spark'
Try `mv --help' for more information.
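
In case it helps with debugging: a quick way to check whether the file that
came down is really a gzip archive is to look at its first two bytes (a valid
gzip stream starts with the magic bytes 1f 8b) and to ask file(1) what the
content actually is. Something like this on the instance, in the directory
the script downloaded into (filename taken from the log above):

# a gzip archive starts with the magic bytes 1f 8b
head -c 2 spark-1.6.1-bin-hadoop1.tgz | od -An -tx1

# file(1) reports the real content type; if S3 served an error
# document instead of the tarball, this will say XML or HTML
file spark-1.6.1-bin-hadoop1.tgz

# checksum, so others can compare their copy of the same object
md5sum spark-1.6.1-bin-hadoop1.tgz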

-- 
Branch <https://branch.io/?bmp=xink-sig>
Augustus Hong
Software Engineer
