Re: Setting s3 credentials in cloudera

2014-04-21 Thread Kishore kumar
With the same credentials I am able to download the s3 file to my local filesystem.

Re: Setting s3 credentials in cloudera

2014-04-21 Thread Kishore kumar
No, I am running it in the CLI.

Re: question about hive sql

2014-04-21 Thread Shengjun Xin
You need to check the container log for the details.
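For reference, on Hadoop 2.x the aggregated container logs can usually be pulled with the yarn CLI once the job has finished; a minimal sketch, with a made-up application id (substitute the id printed in the Hive job output, and note that log aggregation must be enabled):

    # list recent applications to find the id, then fetch its container logs
    yarn application -list
    yarn logs -applicationId application_1398115234567_0001 > container.log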

Re: Hive 0.13.0 - IndexOutOfBounds Exception

2014-04-21 Thread Prasanth Jayachandran
Hi Bryan, Can you provide more information about the input and output tables? Schema? Partitioning and bucketing information? Explain plan of your insert query? This information will help to diagnose the issue. Thanks, Prasanth

question about hive sql

2014-04-21 Thread EdwardKing
I use Hive under Hadoop 2.2.0. First I start hive:
[hadoop@master sbin]$ hive
14/04/21 19:06:32 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
14/04/21 19:06:32 INFO Configuration.deprecation: mapred.max.spl

Hive 0.13.0 - IndexOutOfBounds Exception

2014-04-21 Thread Bryan Jeffrey
Hello. I am running Hadoop 2.4.0 and Hive 0.13.0. I am encountering the following error when converting a text table to ORC via the following command:
Error: Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Erro
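The failing command itself is cut off in the preview; for context, a typical text-to-ORC conversion looks roughly like the sketch below (the table names and columns are illustrative only, not Bryan's actual schema):

    # create an ORC copy of a text table, then load it with an insert-select
    hive -e "
      CREATE TABLE logs_orc (id INT, msg STRING) STORED AS ORC;
      INSERT OVERWRITE TABLE logs_orc SELECT id, msg FROM logs_text;
    "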

Re: [ANNOUNCE] Apache Hive 0.13.0 Released

2014-04-21 Thread Harish Butani
The link to the Release Notes is wrong. Thanks to Szehon Ho for pointing this out. The correct link is: https://issues.apache.org/jira/secure/ReleaseNote.jspa?version=12324986&styleName=Text&projectId=12310843

Re: [ANNOUNCE] Apache Hive 0.13.0 Released

2014-04-21 Thread Thejas Nair
Thanks to Harish for all the hard work managing and getting the release out! This is great news! This is a significant release of Hive! It includes more than twice the number of JIRAs (see the release note link) compared to 0.12 and earlier releases, which also came out after a similar gap of 5

Re: Meta data tables - Hive

2014-04-21 Thread Alan Gates
Hive does not have a traditional SQL information schema. Instead it uses MySQL-style show/describe, so it has show tables, etc. See https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-Show Alan.
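A few of the MySQL-style equivalents, for anyone looking for the USER_TABLES / USER_VIEWS analogue (the database and table names below are placeholders):

    # list databases and tables, then dump full metadata for one table
    hive -e "SHOW DATABASES;"
    hive -e "SHOW TABLES IN default;"
    hive -e "DESCRIBE FORMATTED default.some_table;"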

Re: Executing Hive Queries in Parallel

2014-04-21 Thread Subramanian, Sanjay (HQP)
Hey, instead of going into the Hive CLI I would propose 2 ways.
NOHUP
nohup hive -f path/to/query/file/hive1.hql >> ./hive1.hql_`date +%Y-%m-%d-%H-%M-%S`.log 2>&1
nohup hive -f path/to/query/file/hive2.hql >> ./hive2.hql_`date +%Y-%m-%d-%H-%M-%S`.log 2>&1
nohup hive -f path/to/query/file/hive3.hql >>
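The preview cuts off before the commands are backgrounded; a minimal sketch of the same idea, assuming the file layout from the message, with each invocation sent to the background (the trailing & plus the final wait are what actually let the scripts run concurrently):

    # launch each script in parallel and wait for all of them to finish
    for f in hive1.hql hive2.hql hive3.hql hive4.hql; do
      nohup hive -f "path/to/query/file/$f" \
        >> "./${f}_$(date +%Y-%m-%d-%H-%M-%S).log" 2>&1 &
    done
    wait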

Executing Hive Queries in Parallel

2014-04-21 Thread saurabh
Hi, I need some inputs to execute Hive queries in parallel. I tried doing this using the CLI (by opening multiple ssh connections) and executed 4 HQLs; it was observed that the queries were getting executed sequentially. All four queries got submitted, however, while the first one was in execution mod

"create table as" fails with error

2014-04-21 Thread Subramanian, Sanjay (HQP)
Hey guys, this query fails:
create table olena.temp8 as select * from olena.temp7 group by person_id, level, technical, business, liberalarts, lifesciences, other, school_name, degree_complete_flag
FAILED: Error in metadata: InvalidObjectException(message:temp8 is not a valid object name) 14/0
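One possible explanation, offered here only as a guess: some older Hive releases did not accept a database-qualified target name in CREATE TABLE AS SELECT, and the usual workaround was to switch into the target database first. A sketch of that workaround, with the CTAS body copied from the message and only the USE added:

    # assumed workaround: pick the target database before running the CTAS
    hive -e "
      USE olena;
      CREATE TABLE temp8 AS
      SELECT * FROM olena.temp7
      GROUP BY person_id, level, technical, business, liberalarts,
               lifesciences, other, school_name, degree_complete_flag;
    "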

Re: Help - Hadoop jar null org.apache.hadoop.hive.ql.exec.ExecDriver

2014-04-21 Thread Chinna Rao Lalam
Hi, Check whether hive-exec.jar is corrupted. Hope it helps, Chinna Rao Lalam On Sat, Apr 19, 2014 at 2:38 AM, Abhishek Girish wrote: > Hello, > > I am hitting an error while executing a Hive job inside MapReduce: > > *Code snippet:* > > String select1 = "SELECT a FROM abc"; > > driver.run(s
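A quick way to sanity-check the jar Chinna mentions is to list its contents; the path below is only an example of where a CDH install typically puts it:

    # a non-zero exit status here would suggest the jar really is corrupted
    jar tf /usr/lib/hive/lib/hive-exec-*.jar > /dev/null && echo "hive-exec jar reads OK"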

Re: All Hive JDBC Queries Fail with Same Error: “Caused by: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTa

2014-04-21 Thread Chinna Rao Lalam
Hi, Here the MR job has failed. Check why the MR job failed (from the job logs). Hope it helps, Chinna Rao Lalam On Fri, Apr 18, 2014 at 9:53 PM, Vince George (vincgeor) wrote: > We have just configured a new Hive JDBC client with upgraded support > for per-user Kerberos authenticati

Re: Setting s3 credentials in cloudera

2014-04-21 Thread j.barrett Strausser
You mention Cloudera; are you trying to execute the query from HUE? That requires altering the setting for HUE and not Hive.

Re: Setting s3 credentials in cloudera

2014-04-21 Thread j.barrett Strausser
Hope those aren't your actual credentials.

Re: Setting s3 credentials in cloudera

2014-04-21 Thread Kishore kumar
I Edited "Cluster-wide Configuration Safety Valve for core-site.xml" in cm, and specified as below, but still the problem is same. fs.s3.awsAccessKeyId AKIAJNIM5P2SASWJPHSA fs.s3.awsSecretAccessKey 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 FAILED: Error in metadata: MetaException(message:java

Re: Setting s3 credentials in cloudera

2014-04-21 Thread Kishore kumar
I set the credentials from the Hive command line, but I am still getting the error. Please help me.
hive> set fs.s3.awsAccessKeyId = x;
hive> set fs.s3.awsSecretAccessKey = xxx;
FAILED: Error in metadata: MetaException(message:java.lang.IllegalArgumentException: AWS Access Key ID and

Meta data tables - Hive

2014-04-21 Thread Ravi Prasad
Hi all, In Hive do we have any metadata tables where I can see all the tables' / views' / indexes' information? For example, in Oracle we have the *USER_TABLES* metadata table to learn about all the tables available to the user, and *USER_VIEWS* to learn about all the view information available in

Setting s3 credentials in cloudera

2014-04-21 Thread Kishore kumar
Hi Experts, I am trying to create a table against my s3 file and I ran into the issue below; where do I set these credentials in Cloudera Manager 4.8? I got this link ( http://community.cloudera.com/t5/Cloudera-Manager-Installation/AWS-Access-Key-ID-and-Secret-Access-Key-must-be-specified-as-the/td-p/495) aft
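A side note on this thread: the property prefix has to match the URI scheme of the table location (fs.s3.* for s3:// paths, fs.s3n.* for s3n:// paths). A rough sketch of setting the keys for an s3n location, with placeholder values and a placeholder table name only:

    # placeholder keys and table; the prefix must match the location's scheme
    hive -e "
      SET fs.s3n.awsAccessKeyId=YOUR_ACCESS_KEY;
      SET fs.s3n.awsSecretAccessKey=YOUR_SECRET_KEY;
      SELECT COUNT(*) FROM my_s3_table;
    "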

Analyzing data residing on s3 from a local hadoop cluster

2014-04-21 Thread Kishore kumar
Hi Experts, We are running a four-node cluster installed with CDH 4.5 and CM 4.8. We have large files in zip format in s3 and we want to analyze those files every hour in Hive. Which is the best way to do that? Please help me with examples or with any reference links. -- Thanks, *Kishore
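The usual pattern for this is an external table pointed at the s3 path, queried on whatever schedule cron (or Oozie) drives; the bucket, path, and single-column schema below are placeholders, and note that Hive reads gzip text transparently while zip archives generally have to be repacked first:

    # hypothetical s3n location and schema; run hourly from cron or Oozie
    hive -e "
      CREATE EXTERNAL TABLE IF NOT EXISTS raw_events (line STRING)
      LOCATION 's3n://your-bucket/events/';
      SELECT COUNT(*) FROM raw_events;
    "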

Analyzing data residing on s3 from a local hadoop cluster

2014-04-21 Thread Kishore kumar
Hi Experts, After I changed the column names in a Hive table, the result is showing all null values for the new column names. If I query with select * from table, it gives the actual values. What could be the problem? Please explain what I should do now. -- Thanks, *Kishore *