Hi Pierre,
One way is to recreate your credentials until AWS generates a pair without a
slash character in it. Another way I've been using is to pass the
credentials outside the S3 file path by setting the following (where sc is
the SparkContext; this is the PySpark form):
```
sc._jsc.hadoopConfiguration().set("fs.s3n.awsAccessKeyId", "ACCESS_KEY_ID")
sc._jsc.hadoopConfiguration().set("fs.s3n.awsSecretAccessKey", "SECRET_ACCESS_KEY")
```
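In Scala the same settings are reachable through sc.hadoopConfiguration directly. Here is a minimal, untested sketch of reading your two buckets that way (bucket names and key values are placeholders; it assumes the reads run sequentially, since the credentials are read when Hadoop first creates the FileSystem for a bucket):
```
// Credentials for the first bucket; picked up when bucket1's FileSystem is created.
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "ACCESS_KEY_ID1")
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "SECRET_ACCESS_KEY1")
val c1 = sc.textFile("s3n://bucket1/file.csv").count  // action forces the read now

// Swap in the second bucket's credentials before touching bucket2.
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "ACCESS_KEY_ID2")
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "SECRET_ACCESS_KEY2")
val c2 = sc.textFile("s3n://bucket2/file.csv").count
```
One caveat: Hadoop caches a FileSystem instance per bucket, so whichever credentials are in effect when a bucket is first touched are the ones that stick. The swap trick above is therefore not safe if the two reads need to overlap.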
> On 5 Jun 2015, at 08:03, Pierre B
> wrote:
>
> Hi list!
>
> My problem is quite simple.
> I need to access several S3 buckets, using different credentials:
> ```
> val c1 =
> sc.textFile("s3n://[ACCESS_KEY_ID1:SECRET_ACCESS_KEY1]@bucket1/file.csv").count
> val c2 =
> sc.textFile("s3n://[ACCESS_KEY_ID2:SECRET_ACCESS_KEY2]@bucket2/file.csv").count
> ```
>
> This works, but it fails whenever the AWS credentials contain a slash ("/") character.
> Could someone help me on this?
>
> Thanks
> Pierre
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Access-several-s3-buckets-with-credentials-containing-tp23172.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.