Re: Class not found exception from serde

2014-02-25 Thread Andreas Koch
That is correct, I added the jar file with ADD JAR. Thanks for the patch. On Mon, Feb 24, 2014 at 11:44 PM, Jason Dere wrote: > I'm assuming the jar was loaded using ADD JAR as opposed to the jar being > on the classpath? > > On Feb 24, 2014, at 2:40 PM, Jason Dere wrote: > > I think TableDes

Re: part-m-00000 files and their size - Hive table

2014-02-25 Thread Raj Hadoop
Thanks for the detailed explanation, Yong. It helps. Regards, Raj On Tuesday, February 25, 2014 9:18 PM, java8964 wrote: Yes, it is good that the file sizes are evenly close, but not very important, unless there are files very small (compared to the block size). The reasons are: Your fil

RE: part-m-00000 files and their size - Hive table

2014-02-25 Thread java8964
Yes, it is good that the file sizes are evenly close, but not very important, unless there are files very small (compared to the block size). The reasons are: Your files should be splittable to be used in Hadoop (or in Hive, it is the same thing). If they are splittable, then 1G file will use 10 bl
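The block-count arithmetic the reply alludes to can be sketched as follows; the 128 MB block size is an assumption (HDFS defaults vary by cluster and Hadoop version, and the reply's own figure suggests a different size):

```python
import math

# Assumed HDFS block size of 128 MB; adjust to your cluster's
# dfs.blocksize setting.
BLOCK_SIZE = 128 * 1024 * 1024  # bytes

def num_blocks(file_size_bytes: int) -> int:
    """Number of HDFS blocks (and hence map splits, for a splittable
    format) a file of the given size occupies."""
    return math.ceil(file_size_bytes / BLOCK_SIZE)

one_gb = 1024 * 1024 * 1024
print(num_blocks(one_gb))           # 8 blocks for a 1 GB file
print(num_blocks(1024 * 1024))      # a small 1 MB file still occupies 1 block
```

This is why many small files hurt: each one occupies at least a whole block (and a whole map task), while a few large splittable files parallelize just as well.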

RE: hive query to calculate percentage

2014-02-25 Thread java8964
one query won't work, as totalcount is not in "group by". You have 2 options: 1) use the sub query: select a.timestamp_dt, a.totalcount / b.total_sum from daily_count_per_kg_domain a join (select timestamp_dt, sum(totalcount) as total_sum from daily_count_per_kg_domain group by timestamp_dt) b on (a.tim
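Option 1 from the reply, written out in full as a sketch; the column names (ddnamesyskg, totalcount, timestamp_dt) are taken from the table description in the original question, and the alias names are illustrative:

```sql
-- Per-row percentage of the per-day total, via a self-join against
-- a subquery that aggregates the daily sum.
SELECT a.timestamp_dt,
       a.ddnamesyskg,
       a.totalcount / b.total_sum AS pct_of_day
FROM daily_count_per_kg_domain a
JOIN (SELECT timestamp_dt, SUM(totalcount) AS total_sum
      FROM daily_count_per_kg_domain
      GROUP BY timestamp_dt) b
  ON (a.timestamp_dt = b.timestamp_dt);
```

The join is needed because a plain aggregate query cannot select the non-grouped totalcount column alongside SUM(totalcount).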

part-m-00000 files and their size - Hive table

2014-02-25 Thread Raj Hadoop
Hi, I am loading data to HDFS files through sqoop and creating a Hive table to point to these files. The mapper files through sqoop are generated like this: part-m-00000 part-m-00001 part-m-00002 My question is: 1) For Hive query performance, how important or significant is

Re: Can a hive partition contain a string like 'tr_date=2014-01-01'

2014-02-25 Thread Raj Hadoop
Thanks. Will try it. On Tuesday, February 25, 2014 8:23 PM, Kuldeep Dhole wrote: Probably you should use tr_date='2014-01-01' Considering tr_date partition is there On Tuesday, February 25, 2014, Raj Hadoop wrote: I am trying to create a Hive partition like 'tr_date=2014-01-01' > > >FAI

Re: Can a hive partition contain a string like 'tr_date=2014-01-01'

2014-02-25 Thread Kuldeep Dhole
Probably you should use tr_date='2014-01-01' Considering tr_date partition is there On Tuesday, February 25, 2014, Raj Hadoop wrote: > I am trying to create a Hive partition like 'tr_date=2014-01-01' > > FAILED: ParseException line 1:58 mismatched input '-' expecting ) near > '2014' in add parti
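The fix from the reply, written out as a sketch; the table name bksd is taken from the error output in the original question:

```sql
-- The date value must be quoted: an unquoted 2014-01-01 is parsed as
-- arithmetic, which produces the "mismatched input '-'" ParseException.
ALTER TABLE bksd ADD PARTITION (tr_date='2014-01-01');
```

With the value quoted, the hyphens are just characters in a string literal, and Hive creates a partition directory named tr_date=2014-01-01.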

Can a hive partition contain a string like 'tr_date=2014-01-01'

2014-02-25 Thread Raj Hadoop
I am trying to create a Hive partition like 'tr_date=2014-01-01' FAILED: ParseException line 1:58 mismatched input '-' expecting ) near '2014' in add partition statement hive_ret_val: 64 Errors while executing Hive for bksd table for 2014-01-01 Are hyphens not allowed in the partition directo

Re: hive query to calculate percentage

2014-02-25 Thread Krishnan K
Modify the query to: select totalcount / sum(totalcount) from daily_count_per_kg_domain where timestamp_dt = '20140219' group by timestamp_dt; if you don't specify the where clause, you will get results for all partitions. On Tue, Feb 25, 2014 at 3:14 PM, Manish wrote: > I have a partitioned ta

hive query to calculate percentage

2014-02-25 Thread Manish
I have a partitioned table on timestamp_dt: > desc daily_count_per_kg_domain; OK ddnamesyskg string totalcount int timestamp_dt string hive> select * from daily_count_per_kg_domain; OK sys_kg_band 224 20140219 sys_kg_event 3435 20140219 sys_kg_movies 44987 20140219

RE: java.lang.RuntimeException: cannot find field key from [0:_col0, 1:_col2, 2:_col3]

2014-02-25 Thread java8964
Works for me on 0.10. Yong Date: Tue, 25 Feb 2014 11:37:32 -0800 From: kumarbuyonl...@yahoo.com Subject: Re: java.lang.RuntimeException: cannot find field key from [0:_col0, 1:_col2, 2:_col3] To: user@hive.apache.org Hi, Thanks for looking into it. I am also trying this on hive 0.11 to see if

Re: java.lang.RuntimeException: cannot find field key from [0:_col0, 1:_col2, 2:_col3]

2014-02-25 Thread Kumar V
Hi, Thanks for looking into it. I am also trying this on hive 0.11 to see if it works there. If you get a chance to reproduce this problem on hive 0.10, please let me know. Thanks. On Monday, February 24, 2014 10:59 PM, java8964 wrote: My guess is that your UDTF will return an array of

Re: Hive 0.12 status

2014-02-25 Thread Bhaskar Dutta
On Tue, Feb 25, 2014 at 10:11 PM, Edward Capriolo wrote: > All "stable" really is is a symlink; Hive is heavily unit and integration > tested. The release is also made only after some manual testing. > Releases have historically been very stable. 0.12 has been out for some time. > > > On Tue,

Re: Hive 0.12 status

2014-02-25 Thread Edward Capriolo
All "stable" really is is a symlink; Hive is heavily unit and integration tested. The release is also made only after some manual testing. Releases have historically been very stable. 0.12 has been out for some time. On Tue, Feb 25, 2014 at 9:57 AM, Bhaskar Dutta wrote: > Hi, > > Is the Hiv

Hive 0.12 status

2014-02-25 Thread Bhaskar Dutta
Hi, Is the Hive 0.12 release still not considered "stable"? The stable directory in the apache mirrors shows Hive 0.11 as the stable version. I am testing it with Hadoop 2.2.0 and Oozie 4.0.0, which is the same as the HDP 2.0 stack. >>>On the mirror, all recent releases are available, but are not guar

configuring Hive to automatically create _SUCCESS files

2014-02-25 Thread centerqi hu
Hello, is there a way to configure Hive 0.9 so that it creates _SUCCESS files similar to Hadoop MapReduce and Pig? thx -- cente...@gmail.com|qizhong
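One possibility worth testing, offered here as an assumption rather than a confirmed answer: the _SUCCESS marker in plain MapReduce is produced by FileOutputCommitter under the following flag. Whether Hive 0.9 honors it for the final table directory is unverified, since Hive moves its final output with its own move task:

```xml
<!-- Assumption: standard MapReduce success-marker flag (set in
     mapred-site.xml or per-session). May only produce _SUCCESS in
     intermediate job output directories, not the Hive table directory. -->
<property>
  <name>mapreduce.fileoutputcommitter.marksuccessfuljobs</name>
  <value>true</value>
</property>
```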

Creating managed table with location

2014-02-25 Thread Supriya Sahay
Hi, I was trying to create an internal table specifying the location as well in the create table statement. This is what I used: create table test.employees( name String comment 'Employee name', salary float comment 'Employee salary', subordinates array comment 'Names of subordinat
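A completed sketch of the statement from the message. The original is truncated, so the array element type, the closing columns, and the LOCATION path below are hypothetical fill-ins for illustration, not the poster's actual DDL:

```sql
-- Managed (internal) table with an explicit storage location.
-- The path is an assumed example; Hive will still own and delete
-- the data at this location on DROP TABLE, because the table is managed.
CREATE TABLE test.employees (
  name         STRING        COMMENT 'Employee name',
  salary       FLOAT         COMMENT 'Employee salary',
  subordinates ARRAY<STRING> COMMENT 'Names of subordinates'
)
LOCATION '/user/hive/warehouse/test.db/employees';
```

If the intent is for Hive not to own the files at that location, CREATE EXTERNAL TABLE with the same LOCATION clause is the usual alternative.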