Try any one of the following:

*1. Set the access key and secret key on the SparkConf before creating the SparkContext.* (Note: SparkContext has no set() method; the working equivalent is a "spark.hadoop."-prefixed property on the SparkConf, which Spark copies into the Hadoop configuration.)

sparkConf.set("spark.hadoop.fs.s3n.awsAccessKeyId", yourAccessKey)
sparkConf.set("spark.hadoop.fs.s3n.awsSecretAccessKey", yourSecretKey)
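For context, here is a minimal self-contained sketch of option 1, assuming yourAccessKey and yourSecretKey are placeholders for your real credentials and the s3n:// bucket/path is made up for illustration:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("s3-access-example")
  .set("spark.hadoop.fs.s3n.awsAccessKeyId", yourAccessKey)      // placeholder credential
  .set("spark.hadoop.fs.s3n.awsSecretAccessKey", yourSecretKey)  // placeholder credential
val sc = new SparkContext(conf)

// Any s3n:// path read after this point picks up the credentials set above.
val lines = sc.textFile("s3n://your-bucket/some/input.txt")
println(lines.count())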
*2. Set the access key and secret key in the environment before starting your application:*

export AWS_ACCESS_KEY_ID=<your access>
export AWS_SECRET_ACCESS_KEY=<your secret>

*3. Set the access key and secret key inside the Hadoop configuration* (a runnable sketch follows below):

val hadoopConf = sparkContext.hadoopConfiguration
hadoopConf.set("fs.s3.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
hadoopConf.set("fs.s3.awsAccessKeyId", yourAccessKey)
hadoopConf.set("fs.s3.awsSecretAccessKey", yourSecretKey)
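And a short sketch of option 3 in context; again, yourAccessKey, yourSecretKey, and the s3:// path are illustrative placeholders:

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("s3-access-example"))

// Route s3:// URIs to the native S3 filesystem and attach credentials.
val hadoopConf = sc.hadoopConfiguration
hadoopConf.set("fs.s3.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
hadoopConf.set("fs.s3.awsAccessKeyId", yourAccessKey)      // placeholder credential
hadoopConf.set("fs.s3.awsSecretAccessKey", yourSecretKey)  // placeholder credential

// Reads issued after the configuration is set will use these credentials.
val data = sc.textFile("s3://your-bucket/some/input.txt")
println(data.count())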