[ https://issues.apache.org/jira/browse/HIVE-14373?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15415622#comment-15415622 ]
Abdullah Yousufi commented on HIVE-14373:
-----------------------------------------

Hey [~yalovyyi], currently the best way to switch between different S3 clients is to use the different key names in auth-keys.xml. I created auth-keys.xml.template as an s3a example, but it could easily be adapted for s3n. However, I agree that the bucket variable name in that file should not be specific to s3a. Also, thanks a ton for the review on the patch; I'll get to it shortly.

> Add integration tests for hive on S3
> ------------------------------------
>
>                 Key: HIVE-14373
>                 URL: https://issues.apache.org/jira/browse/HIVE-14373
>             Project: Hive
>          Issue Type: Sub-task
>            Reporter: Sergio Peña
>            Assignee: Abdullah Yousufi
>         Attachments: HIVE-14373.patch
>
>
> With Hive doing improvements to run on S3, it would be ideal to have better
> integration testing on S3.
> These S3 tests won't be able to be executed by HiveQA because they will need
> Amazon credentials. We need to write a suite based on ideas from the Hadoop
> project where:
> - an xml file is provided with S3 credentials
> - a committer must run these tests manually to verify they work
> - the xml file should not be part of the commit, and HiveQA should not run
> these tests.
> https://wiki.apache.org/hadoop/HowToContribute#Submitting_patches_against_object_stores_such_as_Amazon_S3.2C_OpenStack_Swift_and_Microsoft_Azure

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
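To illustrate the auth-keys.xml file discussed above, here is a minimal sketch of what such a credentials file might look like. This is an assumption, not the actual template from the HIVE-14373 patch: the fs.s3a.* names are the standard Hadoop S3A credential properties (the s3n equivalents would be fs.s3n.awsAccessKeyId and fs.s3n.awsSecretAccessKey), while the bucket property name test.s3.bucket is hypothetical, chosen to reflect the comment's point that it should not be s3a-specific.

```xml
<?xml version="1.0"?>
<!-- Hypothetical sketch of auth-keys.xml; property names in the actual
     HIVE-14373 template may differ. Do not commit this file. -->
<configuration>
  <!-- Standard Hadoop S3A credential properties. To test against s3n
       instead, use fs.s3n.awsAccessKeyId / fs.s3n.awsSecretAccessKey. -->
  <property>
    <name>fs.s3a.access.key</name>
    <value>YOUR_AWS_ACCESS_KEY</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>YOUR_AWS_SECRET_KEY</value>
  </property>
  <!-- Hypothetical scheme-neutral bucket property, per the comment that
       the bucket variable should not be specific to s3a. -->
  <property>
    <name>test.s3.bucket</name>
    <value>s3a://your-test-bucket</value>
  </property>
</configuration>
```

Since the file holds live AWS credentials, it is excluded from commits and HiveQA runs; a committer fills it in locally and runs the suite manually, following the Hadoop object-store contribution process linked above.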