With the same credentials, I am able to download the S3 file to my local
filesystem.
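
For reference, the same keys can also be exercised through Hadoop itself before
involving Hive; a minimal check along these lines (the bucket name and paths are
placeholders, not taken from this thread) uses the URL-embedded form that the
error message mentions:

    hadoop fs -ls 's3n://ACCESS_KEY_ID:SECRET_ACCESS_KEY@example-bucket/path/'
    hadoop fs -copyToLocal 's3n://ACCESS_KEY_ID:SECRET_ACCESS_KEY@example-bucket/path/file.csv' /tmp/file.csv

Note that the s3n:// scheme reads the fs.s3n.* properties while s3:// reads the
fs.s3.* ones, and a secret key containing a '/' is known to cause trouble in the
URL-embedded form.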


On Tue, Apr 22, 2014 at 11:17 AM, Kishore kumar <kish...@techdigita.in> wrote:

> No, I am running the query from the CLI.
>
>
> On Mon, Apr 21, 2014 at 8:43 PM, j.barrett Strausser <
> j.barrett.straus...@gmail.com> wrote:
>
>> You mention Cloudera; are you trying to execute the query from Hue? That
>> requires altering the settings for Hue, not Hive.
>>
>>
>> On Mon, Apr 21, 2014 at 11:12 AM, j.barrett Strausser <
>> j.barrett.straus...@gmail.com> wrote:
>>
>>> Hope those aren't your actual credentials.
>>>
>>>
>>> On Mon, Apr 21, 2014 at 11:05 AM, Kishore kumar <kish...@techdigita.in> wrote:
>>>
>>>> I edited "Cluster-wide Configuration Safety Valve for core-site.xml" in
>>>> CM and specified the properties as below, but the problem is still the same.
>>>>
>>>> <property>
>>>>   <name>fs.s3.awsAccessKeyId</name>
>>>>   <value>AKIAJNIM5P2SASWJPHSA</value>
>>>> </property>
>>>>
>>>> <property>
>>>>   <name>fs.s3.awsSecretAccessKey</name>
>>>>   <value>BN1hkKD7JY4LGGNbjxmnFE0ehs12vXmP44GCKV2N</value>
>>>> </property>
>>>>
>>>>
>>>> FAILED: Error in metadata:
>>>> MetaException(message:java.lang.IllegalArgumentException: AWS Access Key ID
>>>> and Secret Access Key must be specified as the username or password
>>>> (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or
>>>> fs.s3.awsSecretAccessKey properties (respectively).)
>>>> FAILED: Execution Error, return code 1 from
>>>> org.apache.hadoop.hive.ql.exec.DDLTask
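
As the error text above notes, the credentials can also ride along in the table
location itself. A minimal illustration of that form (the table name, columns,
bucket and path are invented for the example; they are not from this thread):

    hive> CREATE EXTERNAL TABLE s3_demo (id INT, name STRING)
        > ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
        > LOCATION 's3n://ACCESS_KEY_ID:SECRET_ACCESS_KEY@example-bucket/demo/';

Whichever route is used, the property names have to match the URI scheme of the
LOCATION: fs.s3.awsAccessKeyId / fs.s3.awsSecretAccessKey for s3:// paths, and
fs.s3n.awsAccessKeyId / fs.s3n.awsSecretAccessKey for s3n:// paths.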
>>>>
>>>> Thanks,
>>>> Kishore.
>>>>
>>>>
>>>> On Mon, Apr 21, 2014 at 8:17 PM, Kishore kumar <kish...@techdigita.in> wrote:
>>>>
>>>>> I set the credentials from the Hive command line, but I am still getting
>>>>> the error. Please help me.
>>>>>
>>>>> hive> set fs.s3.awsAccessKeyId = xxxxxxxxx;
>>>>> hive> set fs.s3.awsSecretAccessKey = xxxxxxxxxxxxxxx;
>>>>>
>>>>> FAILED: Error in metadata:
>>>>> MetaException(message:java.lang.IllegalArgumentException: AWS Access Key ID
>>>>> and Secret Access Key must be specified as the username or password
>>>>> (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or
>>>>> fs.s3.awsSecretAccessKey properties (respectively).)
>>>>> FAILED: Execution Error, return code 1 from
>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
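
A couple of possibilities, assuming a standard setup (the thread does not show
it): Hive's SET only affects the current client session, and the "Error in
metadata" wording suggests the path check happens on the metastore side, which
reads its own core-site.xml/hive-site.xml rather than per-session SETs; that
would explain why the cluster-wide safety valve plus a service restart is the
more reliable route. It also does no harm to write the assignments without
spaces around '=' so no stray whitespace can end up in the key or value:

    hive> set fs.s3.awsAccessKeyId=xxxxxxxxx;
    hive> set fs.s3.awsSecretAccessKey=xxxxxxxxxxxxxxx;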
>>>>>
>>>>> Thanks,
>>>>> Kishore.
>>>>>
>>>>>
>>>>>
>>>>> On Mon, Apr 21, 2014 at 7:33 PM, Kishore kumar <kish...@techdigita.in> wrote:
>>>>>
>>>>>> Hi Experts,
>>>>>>
>>>>>> I am trying to create a table against my S3 file and ran into the issue
>>>>>> below. Where do I set these credentials in Cloudera Manager 4.8? I found
>>>>>> this link (
>>>>>> http://community.cloudera.com/t5/Cloudera-Manager-Installation/AWS-Access-Key-ID-and-Secret-Access-Key-must-be-specified-as-the/td-p/495)
>>>>>> after some research, but please explain clearly how to specify the values
>>>>>> after editing "Cluster-wide Configuration Safety Valve for core-site.xml".
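
For completeness, a sketch of what the safety-valve entry might look like and
the steps that usually make it take effect (the values below are placeholders,
and the exact menu wording can differ between Cloudera Manager versions):

    <property>
      <name>fs.s3.awsAccessKeyId</name>
      <value>YOUR_ACCESS_KEY_ID</value>
    </property>
    <property>
      <name>fs.s3.awsSecretAccessKey</name>
      <value>YOUR_SECRET_ACCESS_KEY</value>
    </property>

After saving, the client configuration has to be redeployed and the affected
services (Hive, and the Hive Metastore in particular) restarted; otherwise the
running processes keep the old core-site.xml.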
>>>>>>
>>>>>> -- Thanks,
>>>>>>
>>>>>>
>>>>>> Kishore
>>>>>>


-- 

Kishore Kumar
ITIM
