[ https://issues.apache.org/jira/browse/SQOOP-3385?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16621797#comment-16621797 ]

Boglarka Egyed commented on SQOOP-3385:
---------------------------------------

Sqoop uses Hadoop 2.8.0 on trunk. Please also note that this feature has not 
been released in Sqoop yet; it is only available on the latest trunk version 
and will be included in the next release (under version 1.5.0 or 3.0.0). 
Which versions are you using exactly?

 

> Error while connecting to S3 using Sqoop
> ----------------------------------------
>
>                 Key: SQOOP-3385
>                 URL: https://issues.apache.org/jira/browse/SQOOP-3385
>             Project: Sqoop
>          Issue Type: Bug
>          Components: connectors
>    Affects Versions: 1.4.7
>            Reporter: Suchit
>            Priority: Minor
>              Labels: S3
>
> I am facing an issue while trying to import a file from an on-prem DB to S3 
> using Sqoop.
> Things I am able to do:
> 1- I am connected to S3 and able to run aws s3 ls and other AWS CLI commands.
> 2- Able to generate a file by connecting to the DB from the local Unix box.
> But when I change the target directory to S3 instead of local, I get the 
> error below:
>  
>  "ERROR tool.ImportTool: Import failed: AWS Access Key ID and Secret Access 
> Key must be specified as the username or password (respectively) of a s3 URL, 
> or by setting the fs.s3.awsAccessKeyId or fs.s3.awsSecretAccessKey properties 
> (respectively)."
>   
>  Ideally the Sqoop installation should be able to pick up the credentials 
> from the credentials file inside the .aws directory of the user running the 
> command, but is there a way I can specify the credentials?
>   
>  Thanks in advance.
>   
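One way to pass the credentials explicitly is through Hadoop's generic `-D` options on the Sqoop command line. This is only a sketch, assuming the S3A connector (hadoop-aws) is available on the classpath and that the environment variables, JDBC URL, table name, and bucket below are placeholders:

```shell
# Supply S3 credentials via Hadoop configuration properties (S3A scheme);
# -D options must come before the tool-specific arguments.
sqoop import \
    -Dfs.s3a.access.key="${AWS_ACCESS_KEY_ID}" \
    -Dfs.s3a.secret.key="${AWS_SECRET_ACCESS_KEY}" \
    --connect jdbc:mysql://onprem-host/mydb \
    --table mytable \
    --target-dir s3a://mybucket/mytable
```

Note that the error message mentions the older fs.s3.* properties, which belong to the legacy s3:// filesystem; the s3a:// scheme and fs.s3a.* properties are generally what a Hadoop 2.8-based build expects. Whether the ~/.aws credentials file is picked up automatically depends on the configured fs.s3a.aws.credentials.provider chain and the AWS SDK version in use.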



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
