Hi All,
After digging into the code more, I realized that the GroupByOperator can be
present on the map side of the computation as well, in which case it is doing
partial aggregations. So in that case the UDAF's terminate will get called for
partial results. However, for the queries that I tried the te
BTW, one of those tables is partitioned and the other one isn't.
I don't know if that makes any difference.
Fernando
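A quick way to verify whether the map-side GroupByOperator is in play is to
disable map-side aggregation, which forces all aggregation, and therefore the
UDAF's terminate call, to the reduce side. This is only a sketch; my_udaf is a
hypothetical stand-in for the actual UDAF, which the thread never shows:
hive> -- Turn off map-side partial aggregation: no GroupByOperator runs in the mappers
hive> set hive.map.aggr=false;
hive> -- my_udaf is a hypothetical placeholder for the custom UDAF under test
hive> SELECT id, my_udaf(age_mean) FROM age_mean GROUP BY id;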
On Tue, Jan 15, 2013 at 4:59 PM, Fernando Andrés Doglio Turissini <
fernando.dog...@globant.com> wrote:
Sorry about that; I'm using the columnar SerDe on both tables. Do you need
anything else?
I don't have the CREATE TABLE statements for them, so I can't give you that
particular code.
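If the original DDL is lost, Hive can reconstruct most of it for you. A
sketch, with the caveat that SHOW CREATE TABLE only exists on Hive 0.10 and
later:
hive> -- Prints the SerDe, input/output formats, and location for the table
hive> DESCRIBE FORMATTED data_table;
hive> -- On Hive 0.10+ this prints a complete CREATE TABLE statement
hive> SHOW CREATE TABLE data_table;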
On Tue, Jan 15, 2013 at 4:46 PM, Mark Grover wrote:
I was more interested in knowing if you were using any particular SerDes.
You don't have to list out the columns; just the skeleton CREATE TABLE
statement should do.
On Tue, Jan 15, 2013 at 10:43 AM, Fernando Andrés Doglio Turissini <
fernando.dog...@globant.com> wrote:
The "data_table" has around 5k fields, all doubles.
As for the "age_mean" table, here it is:
hive> desc age_mean;
OK
id string
name string
age_mean double
Time taken: 0.127 seconds
Does this help?
Thanks!
Fernando
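From that desc output and the columnar SerDe mentioned earlier, the skeleton
Mark asked for would look roughly like this; it is a reconstruction, not the
real DDL, and the storage format and any partitioning are assumptions:
hive> CREATE TABLE age_mean (
    >   id STRING,
    >   name STRING,
    >   age_mean DOUBLE)
    > ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe'
    > STORED AS RCFILE;  -- ColumnarSerDe is typically paired with RCFile storage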
On Tue, Jan 15, 2013 at 4:35 PM, Mark Grover wrote:
Fernando,
Could you share your table definitions as well please?
On Tue, Jan 15, 2013 at 10:31 AM, Fernando Andrés Doglio Turissini <
fernando.dog...@globant.com> wrote:
Hello everyone, I'm struggling with an exception I'm getting on a
particular query that's driving me crazy!
Here is the exception I get:
java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException:
Hive Runtime Error while processing writable
org.apache.hadoop.hive.serde2.column
I don't think this is the right list for your query. Moving the hadoop list
to bcc and cc'ing the hive list.
Also, I don't see how you can get a unix timestamp from a field with only
hour granularity; are you missing the date information in your date format?
Viral
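To illustrate the point: Hive's unix_timestamp(string, pattern) fills any
date fields missing from the pattern with epoch defaults, so an hour-only
format silently drops the date. A sketch with made-up values:
hive> -- Full pattern: parses to the intended instant
hive> SELECT unix_timestamp('2013-01-15 16', 'yyyy-MM-dd HH');
hive> -- Hour-only pattern: the date defaults to 1970-01-01 (in the session timezone),
hive> -- so this is just 16:00 past the epoch, not a real event time
hive> SELECT unix_timestamp('16', 'HH');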