On 8 Jun 2016, at 16:34, Daniel Haviv <daniel.ha...@veracity-group.com> wrote:

Hi,
I'm trying to create a table on s3a but I keep hitting the following error:
Exception in thread "main" org.apache.hadoop.hive.ql.metadata.HiveException: 
MetaException(message:com.cloudera.com.amazonaws.AmazonClientException: Unable 
to load AWS credentials from any provider in the chain)



I tried setting the s3a keys using the configuration object, but I might be
hitting SPARK-11364 (https://issues.apache.org/jira/browse/SPARK-11364):

conf.set("fs.s3a.access.key", accessKey)
conf.set("fs.s3a.secret.key", secretKey)
conf.set("spark.hadoop.fs.s3a.access.key",accessKey)
conf.set("spark.hadoop.fs.s3a.secret.key",secretKey)

val sc = new SparkContext(conf)



I tried setting these properties in hdfs-site.xml, but I'm still getting this
error.



Try core-site.xml rather than hdfs-site.xml; the latter only gets loaded when
an HdfsConfiguration() instance is created, and that may be a bit too late.
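
Something along these lines in core-site.xml (a minimal sketch;
YOUR_ACCESS_KEY and YOUR_SECRET_KEY are placeholders for your own values):

<configuration>
  <!-- S3A credentials; core-site.xml is read by every Hadoop Configuration -->
  <property>
    <name>fs.s3a.access.key</name>
    <value>YOUR_ACCESS_KEY</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>YOUR_SECRET_KEY</value>
  </property>
</configuration>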

Finally, I tried setting the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
environment variables, but with no luck.




Those env vars aren't picked up directly by S3A (well, that was fixed over the
weekend in HADOOP-12807, https://issues.apache.org/jira/browse/HADOOP-12807).
There's some fixup in Spark (see SparkHadoopUtil.appendS3AndSparkHadoopConfigurations());
I don't know if that is a factor.
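
One more thing worth trying, as an untested sketch: set the keys straight on
the context's live Hadoop configuration once it is up, which bypasses both the
SparkConf propagation path and the env-var pickup (this assumes the usual
AWS_* env vars are exported in the driver's environment):

val sc = new SparkContext(conf)
// Push the credentials straight into the live Hadoop Configuration;
// anything built on this context (e.g. a HiveContext) should then see them.
sc.hadoopConfiguration.set("fs.s3a.access.key", sys.env("AWS_ACCESS_KEY_ID"))
sc.hadoopConfiguration.set("fs.s3a.secret.key", sys.env("AWS_SECRET_ACCESS_KEY"))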

Any ideas on how to resolve this issue?



Thank you.
Daniel

