[ https://issues.apache.org/jira/browse/SQOOP-3385?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16623361#comment-16623361 ]
Steve Loughran commented on SQOOP-3385:
---------------------------------------

BTW, if you have bin/hadoop on your command line, you can use this [store diagnostics tool|https://github.com/steveloughran/cloudstore] to work on getting your settings right before trying to get Sqoop to pick them up.

Hope this helps.

> Error while connecting to S3 using Sqoop
> ----------------------------------------
>
>              Key: SQOOP-3385
>              URL: https://issues.apache.org/jira/browse/SQOOP-3385
>          Project: Sqoop
>       Issue Type: Bug
>       Components: connectors
> Affects Versions: 1.4.7
>         Reporter: Suchit
>         Priority: Minor
>           Labels: S3
>
> I am facing an issue while trying to import a file from an on-prem DB to S3 using Sqoop.
>
> Things I am able to do:
> 1. I am connected to S3 and can run aws s3 ls and other AWS CLI commands.
> 2. I can generate a file on the local Unix box by connecting to the DB.
>
> But when I change the target directory to S3 instead of local, I get the error below:
>
> "ERROR tool.ImportTool: Import failed: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or fs.s3.awsSecretAccessKey properties (respectively)."
>
> Ideally the Sqoop installation should pick up the credentials from the credentials file inside the .aws directory of the user running the command, but is there a way I can specify the credentials explicitly?
>
> Thanks in advance.
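For reference, a rough sketch of what the suggestion above and the credentials question might look like on the command line. The bucket name, JDBC URL, table name and jar file name are placeholders, not taken from this ticket, and it assumes the S3A connector (hadoop-aws) is on the classpath; the error quoted in the ticket comes from the older fs.s3 connector, whereas on a current Hadoop the s3a:// scheme and the fs.s3a.* properties are the ones to use.

    # 1. Check that the S3A settings and credentials resolve before involving Sqoop
    #    (jar name depends on the cloudstore release you downloaded)
    hadoop jar cloudstore-*.jar storediag s3a://your-bucket/

    # 2. Pass the credentials explicitly as generic Hadoop properties;
    #    the -D options must come immediately after the Sqoop tool name
    sqoop import \
      -Dfs.s3a.access.key=YOUR_ACCESS_KEY \
      -Dfs.s3a.secret.key=YOUR_SECRET_KEY \
      --connect jdbc:mysql://dbhost/yourdb \
      --username dbuser \
      --table YOUR_TABLE \
      --target-dir s3a://your-bucket/path/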