A good place to start debugging from. The full list of S3A configuration options is here:
https://hortonworks.github.io/hdp-aws/s3-configure/index.html
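(Not from the thread itself: a minimal sketch of wiring the basic S3A options from that page into a Spark job. fs.s3a.access.key and fs.s3a.secret.key are standard hadoop-aws property names; the bucket and path are placeholders, and hadoop-aws must be on the classpath.)

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch only: credentials come from the environment, not hard-coded.
    val sc = new SparkContext(new SparkConf().setAppName("s3a-debug"))
    sc.hadoopConfiguration.set("fs.s3a.access.key", sys.env("AWS_ACCESS_KEY_ID"))
    sc.hadoopConfiguration.set("fs.s3a.secret.key", sys.env("AWS_SECRET_ACCESS_KEY"))
    val rdd = sc.textFile("s3a://your-bucket/path/") // placeholder bucket/path
    println(rdd.count())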
How did you solve the problem with V4 (Signature Version 4) authentication?
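(An aside, not an answer from the thread: the workaround usually cited for V4-only regions, assuming the S3A connector and AWS SDK v1, is to pin the region endpoint and enable V4 signing in the SDK before any S3 client is created. Treat the property names below as assumptions to verify against your Hadoop/SDK versions.)

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical sketch for a V4-only region (eu-central-1 is an example).
    // The system property must be set before the S3 client is first built.
    System.setProperty("com.amazonaws.services.s3.enableV4", "true")
    val sc = new SparkContext(new SparkConf().setAppName("s3a-v4"))
    sc.hadoopConfiguration.set("fs.s3a.endpoint", "s3.eu-central-1.amazonaws.com")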
I think Hadoop 2.6 failed to abort streams that weren't fully read, which
we observed as a huge performance hit. We had to backport the 2.7
improvements before being able to use it.
et")
Try setting them to s3n as opposed to just s3
Good luck!
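(A runnable sketch of the suggestion above, assuming the keys live in the environment; the bucket name and path are placeholders.)

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("s3n-read"))
    // Note the s3n scheme in both the property names and the URL.
    sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", sys.env("AWS_ACCESS_KEY_ID"))
    sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", sys.env("AWS_SECRET_ACCESS_KEY"))
    val lines = sc.textFile("s3n://your-bucket/path/file.txt") // placeholder
    lines.take(5).foreach(println)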
org.jets3t.service.S3ServiceException: S3 HEAD request failed for
'/user%2Fdidi' - ResponseCode=400, ResponseMessage=Bad Request

What does the user have to do here? I am using a key & secret!
How can I simply create an RDD from a text file on S3?
Thanks
Didi
> your application will now either use an SDK to access the resources using
> IAM roles, or call the EC2 instance metadata service to obtain the
> temporary credentials.
--
Maybe you can use the AWS SDK in your application to provide AWS credentials?
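(A sketch of that idea with the AWS Java SDK v1, not code from the thread: DefaultAWSCredentialsProviderChain falls back to the EC2 instance metadata service when running under an IAM role. Note the session-token caveat in the comments.)

    import com.amazonaws.auth.{AWSSessionCredentials, DefaultAWSCredentialsProviderChain}
    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("iam-creds"))
    // Resolve credentials the way the SDK does: env vars, system properties,
    // then the EC2 instance metadata service (IAM role).
    val creds = new DefaultAWSCredentialsProviderChain().getCredentials
    sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", creds.getAWSAccessKeyId)
    sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", creds.getAWSSecretKey)
    creds match {
      case s: AWSSessionCredentials =>
        // IAM-role credentials are temporary and carry a session token that
        // the old s3n connector has no property for, so s3n may still reject
        // them; shown only to illustrate the SDK call.
        println("Got temporary credentials with a session token")
      case _ => ()
    }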
Thanks for the pointers.
I verified that the access key-id/secret used are valid. However, the
secret may contain "/" at times. The issues I am facing are as follows:
- The EC2 instances are set up with an IAMRole and don't have a static
key-id/secret
- All of the EC2 instances have access to the S3 buckets through that IAMRole
Hi,
keep in mind that you're going to have a bad time if your secret key
contains a "/".
This is due to an old and stupid Hadoop bug:
https://issues.apache.org/jira/browse/HADOOP-3733
The best way is to regenerate the key so it does not include a "/".
/Raf
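(A trivial guard one might add, purely illustrative: fail fast on the slash, and prefer passing keys through configuration properties rather than embedding them in the s3n:// URI, since HADOOP-3733 bites on URI parsing.)

    // Illustrative only: detect the HADOOP-3733 trap up front.
    val secret = sys.env.getOrElse("AWS_SECRET_ACCESS_KEY", "")
    require(!secret.contains("/"),
      "Secret key contains '/': regenerate it, or pass it via fs.s3n.* " +
      "configuration properties instead of embedding it in the s3n:// URI")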
Akhil Das wrote:
> Try the following:
>
> 1. Set the access key and secret key in the sparkContext:
>     sparkContext.set("AWS_ACCESS_KEY_ID", yourAccessKey)
>     sparkContext.set("AWS_SECRET_ACCESS_KEY", yourSecretKey)
>
> 2. Set the access key and secret key in the environment before starting
> your application:
>     export AWS_ACCESS_KEY_ID=<your access key>
>     export AWS_SECRET_ACCESS_KEY=<your secret key>
>
> 3. Set the access key and secret key in the Hadoop configuration:
>     hadoopConf.set("fs.s3.awsAccessKeyId", yourAccessKey)
>     hadoopConf.set("fs.s3.awsSecretAccessKey", yourSecretKey)
>
> 4. You can also try:
>     val lines = sparkContext.textFile("s3n://yourAccessKey:yourSecretKey@<yourBucket>/path/")
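(Option 1 above uses an API spelling that may not match your Spark version; a sketch of the same idea that works on later releases is the spark.hadoop.* prefix, which Spark copies into the Hadoop configuration, so these entries land as the fs.s3n.* properties from option 3.)

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch: spark.hadoop.* entries are forwarded to the Hadoop Configuration.
    val conf = new SparkConf()
      .setAppName("s3-conf")
      .set("spark.hadoop.fs.s3n.awsAccessKeyId", sys.env("AWS_ACCESS_KEY_ID"))
      .set("spark.hadoop.fs.s3n.awsSecretAccessKey", sys.env("AWS_SECRET_ACCESS_KEY"))
    val sc = new SparkContext(conf)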
There is detailed information available in the official documentation[1].
If you don't have a key pair, you can generate one as described in AWS
documentation [2]. That should be enough to get started.
[1] http://spark.apache.org/docs/latest/ec2-scripts.html
[2] http://docs.aws.amazon.com/AWSEC2/l
Hi Daniil
Could you provide some more details on how the cluster should be
launched/configured? The EC2 instance that I am dealing with uses the
concept of IAMRoles. I don't have any "keyfile" to specify to the spark-ec2
script.
Thanks for your help.
- Ranga
(Copying the user list)
You should use the spark_ec2 script to configure the cluster. If you use the
trunk version you can use the new --copy-aws-credentials option to configure
the S3 parameters automatically; otherwise either include them in your
SparkConf variable or add them to the Hadoop configuration under
/root/spark/ephemeral-hdfs.
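(A hypothetical invocation putting those pieces together; the key-pair name, identity file, and cluster name are placeholders, and --copy-aws-credentials exists only in trunk builds as noted above.)

    ./spark-ec2 -k my-keypair -i ~/.ssh/my-keypair.pem --copy-aws-credentials launch my-cluster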
Is there a way to specify a request header during the .textFile call?
- Ranga
Hi
I am trying to access files/buckets in S3 and encountering a permissions
issue. The buckets are configured to authenticate using an IAMRole provider.
I have set the KeyId and Secret using environment variables
(AWS_SECRET_ACCESS_KEY and AWS_ACCESS_KEY_ID). However, I am still unable to
access the S3 buckets.
Before setting the access key and secret the error was:
"java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key
must be specified as the username or password (respectively)..."