Re: A number of issues when running spark-ec2

2016-04-16 Thread Ted Yu
Thanks Josh. I downloaded spark-1.6.1-bin-hadoop2.3.tgz and
spark-1.6.1-bin-hadoop2.4.tgz, and both expand without error.

On Sat, Apr 16, 2016 at 4:54 PM, Josh Rosen wrote:
> Using a different machine / toolchain, I've downloaded and re-uploaded all
> of the 1.6.1 artifacts to that S3 bucket, so hopefully everything should
> be working now.

Re: A number of issues when running spark-ec2

2016-04-16 Thread Josh Rosen
Using a different machine / toolchain, I've downloaded and re-uploaded all
of the 1.6.1 artifacts to that S3 bucket, so hopefully everything should be
working now. Let me know if you still encounter any problems with
unarchiving.

On Sat, Apr 16, 2016 at 3:10 PM Ted Yu wrote:
> Pardon me - there is no tarball for hadoop 2.7.
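Re-uploading replaces the corrupt copy, but a checksum comparison would catch the corruption at download time. Below is a minimal sketch of that workflow using `sha256sum`; the artifact here is a locally generated stand-in, and whether checksum files are published alongside the tarballs in this particular S3 bucket is not stated in the thread (Apache release pages do publish them for official downloads).

```shell
#!/bin/sh
# Sketch: verify a downloaded artifact against a recorded checksum before
# unpacking it. The file below is a local stand-in for illustration; for a
# real release, fetch the checksum from the official download page, not
# from the same location as the tarball.
set -e
workdir=$(mktemp -d)
cd "$workdir"

printf 'stand-in artifact bytes\n' > spark-1.6.1-bin-hadoop2.6.tgz

# Publisher side: record the checksum at upload time.
sha256sum spark-1.6.1-bin-hadoop2.6.tgz > spark-1.6.1-bin-hadoop2.6.tgz.sha256

# Downloader side: verify before unpacking; exits non-zero on a mismatch,
# so a corrupt or truncated download is caught here.
sha256sum -c spark-1.6.1-bin-hadoop2.6.tgz.sha256
```

On a clean run the last command prints an `OK` line for the file; if the bytes had changed in transit, it would report a mismatch and fail.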

Re: A number of issues when running spark-ec2

2016-04-16 Thread Ted Yu
Pardon me - there is no tarball for hadoop 2.7. I downloaded
https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.6.tgz
and expanded it successfully. FYI

On Sat, Apr 16, 2016 at 2:52 PM, Jon Gregg wrote:
> That link points to hadoop2.6.tgz. I tried changing the URL to
> https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.7.tgz
> and I get a NoSuchKey error.

Re: A number of issues when running spark-ec2

2016-04-16 Thread Jon Gregg
That link points to hadoop2.6.tgz. I tried changing the URL to
https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.7.tgz
and I get a NoSuchKey error. Should I just go with it even though it says
hadoop2.6?

On Sat, Apr 16, 2016 at 5:37 PM, Ted Yu wrote:
> BTW this was the original thread:
> http://search-hadoop.com/m/q3RTt0Oxul0W6Ak

Re: A number of issues when running spark-ec2

2016-04-16 Thread Ted Yu
BTW this was the original thread:
http://search-hadoop.com/m/q3RTt0Oxul0W6Ak

The link for spark-1.6.1-bin-hadoop2.7 is
https://s3.amazonaws.com/spark-related-packages/spark-1.6.1-bin-hadoop2.7.tgz

Re: A number of issues when running spark-ec2

2016-04-16 Thread Ted Yu
From the output you posted:

---
Unpacking Spark
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error is not recoverable: exiting now
---

The artifact for spark-1.6.1-bin-hadoop2.6 is corrupt. This problem has been
reported in other threads. Try spark-1.6.1-bin-hadoop
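The failure mode above (a payload that is not a gzip stream, e.g. an S3 error page saved under a .tgz name) can be detected before unpacking with `gzip -t`, which test-decompresses without writing output. A minimal sketch, using locally generated files to stand in for a good and a bad download:

```shell
#!/bin/sh
# Build one valid tarball and one deliberately corrupt stand-in, then run
# the same integrity check on both. File names are illustrative only.
set -e
workdir=$(mktemp -d)
cd "$workdir"

echo "hello" > payload.txt
tar czf good.tgz payload.txt
# A download that actually returned an S3 XML error page, not a gzip stream:
echo "<Error><Code>NoSuchKey</Code></Error>" > bad.tgz

# gzip -t fails fast on a corrupt or mislabeled file, instead of tar
# aborting partway through extraction with "stdin: not in gzip format".
for f in good.tgz bad.tgz; do
  if gzip -t "$f" 2>/dev/null; then
    echo "$f: OK"
  else
    echo "$f: not in gzip format"
  fi
done
```

Running the check before `tar xzf` gives the same diagnosis as the error output above, but without leaving a half-extracted directory behind.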