[ https://issues.apache.org/jira/browse/SPARK-16363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15361798#comment-15361798 ]

Ashic Mahtab commented on SPARK-16363:
--------------------------------------

I just tried setting the two exports... still no joy. They would likely need 
to be set on the worker node that hosts the driver anyway. In any case, the 
real issue is that Spark should pick these up automatically.
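
(For reference, the two exports in question are presumably the standard AWS 
credential variables, e.g.:

    export AWS_ACCESS_KEY_ID=<key-id>
    export AWS_SECRET_ACCESS_KEY=<secret-key>

As far as I can tell, Spark is meant to map these onto the 
fs.s3.awsAccessKeyId / fs.s3.awsSecretAccessKey properties at startup. In 
cluster deploy mode the driver is launched by a worker, so they would have to 
be present in that worker's environment, not just on the submitting machine.)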

> Spark-submit doesn't work with IAM Roles
> ----------------------------------------
>
>                 Key: SPARK-16363
>                 URL: https://issues.apache.org/jira/browse/SPARK-16363
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.6.2
>         Environment: Spark Stand-Alone with EC2 instances configured with IAM Roles.
>            Reporter: Ashic Mahtab
>
> When running Spark stand-alone on EC2 boxes,
>
>   spark-submit --master spark://master-ip:7077 --class Foo \
>     --deploy-mode cluster --verbose s3://bucket/dir/foo/jar
>
> fails to find the jar even when AWS IAM roles are configured to allow the EC2 
> boxes (running the Spark master and workers) access to the file in S3. The 
> exception is provided below. It asks us to set keys, etc. even though the 
> boxes are configured via IAM roles. 
> 16/07/04 11:44:09 ERROR ClientEndpoint: Exception from cluster was: java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or fs.s3.awsSecretAccessKey properties (respectively).
> java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or fs.s3.awsSecretAccessKey properties (respectively).
>         at org.apache.hadoop.fs.s3.S3Credentials.initialize(S3Credentials.java:66)
>         at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.initialize(Jets3tFileSystemStore.java:82)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
>         at com.sun.proxy.$Proxy5.initialize(Unknown Source)
>         at org.apache.hadoop.fs.s3.S3FileSystem.initialize(S3FileSystem.java:77)
>         at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
>         at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1686)
>         at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:598)
>         at org.apache.spark.util.Utils$.fetchFile(Utils.scala:395)
>         at org.apache.spark.deploy.worker.DriverRunner.org$apache$spark$deploy$worker$DriverRunner$$downloadUserJar(DriverRunner.scala:150)
>         at org.apache.spark.deploy.worker.DriverRunner$$anon$1.run(DriverRunner.scala:79)
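
(A workaround sketch, not from the ticket: the s3:// scheme in this trace is 
the old jets3t-backed S3FileSystem, which only reads the two fs.s3 key 
properties and knows nothing about instance-profile credentials. If the Hadoop 
build on the workers includes the hadoop-aws module, pointing the jar at the 
S3A connector instead may let the AWS SDK credential chain pick up the IAM 
role:

  spark-submit --master spark://master-ip:7077 --class Foo \
    --deploy-mode cluster --verbose s3a://bucket/dir/foo/jar

Whether this works depends on hadoop-aws and a matching aws-java-sdk being on 
the worker classpath; the actual request in this issue is for the s3:// fetch 
to honour IAM roles without any extra configuration.)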


