Hi @KhajaAsmath Mohammed,

Try these config settings and see if you can access the S3 bucket:

spark.sparkContext.hadoopConfiguration.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
spark.sparkContext.hadoopConfiguration.set("fs.s3a.access.key", aws.access_id)
spark.sparkContext.hadoopConfiguration.set("fs.s3a.secret.key", aws.secret_access_key)
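For reference, here is a minimal self-contained sketch of a job using those settings. The object name, the bucket/path, and reading the credentials from environment variables are illustrative placeholders, not taken from your setup:

import org.apache.spark.sql.SparkSession

object S3AReadExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("S3AReadExample")
      .getOrCreate()

    // Route s3a:// URIs to the Hadoop S3A connector and hand it credentials.
    val hadoopConf = spark.sparkContext.hadoopConfiguration
    hadoopConf.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
    hadoopConf.set("fs.s3a.access.key", sys.env.getOrElse("AWS_ACCESS_KEY_ID", ""))     // placeholder credential source
    hadoopConf.set("fs.s3a.secret.key", sys.env.getOrElse("AWS_SECRET_ACCESS_KEY", "")) // placeholder credential source

    // Note the s3a:// scheme in the path: a plain s3:// URI is handled by the
    // old S3FileSystem, which looks for fs.s3.awsAccessKeyId / fs.s3.awsSecretAccessKey
    // rather than the fs.s3a.* properties set above.
    val df = spark.read.parquet("s3a://your-bucket/path/to/data") // placeholder path
    df.show()

    spark.stop()
  }
}

One thing to watch: S3AFileSystem lives in the hadoop-aws module, so it has to be on the classpath, e.g. spark-submit --packages org.apache.hadoop:hadoop-aws:<version matching your Hadoop>, together with a compatible AWS SDK jar.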
Thanks,
Badri

________________________________
From: KhajaAsmath Mohammed <mdkhajaasm...@gmail.com>
Sent: 19 May 2021 04:41
To: user @spark <user@spark.apache.org>
Subject: S3 Access Issues - Spark

Hi,

I have written a sample Spark job that reads data residing in HBase. I keep getting the error below; any suggestions on how to resolve it, please?

Caused by: java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified by setting the fs.s3.awsAccessKeyId and fs.s3.awsSecretAccessKey properties (respectively).
        at org.apache.hadoop.fs.s3.S3Credentials.initialize(S3Credentials.java:74)

conf.set("fs.s3.impl", "org.apache.hadoop.fs.s3.S3FileSystem")
conf.set("fs.s3.awsAccessKeyId", "ddd")
conf.set("fs.s3.awsSecretAccessKey", "dddddd")
conf.set("fs.s3n.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
conf.set("fs.s3n.awsAccessKeyId", "xxxxxxx")
conf.set("fs.s3n.awsSecretAccessKey", "xxxx")

I tried these settings in both the Spark config and the HBase config, but none of them resolved my issue.

Thanks,
Asmath