Thank you both,
At a quick glance, that looks like what I am looking for. When I get
it working, I'll post the solution.
Cheers,
Tim
On Mon, Nov 8, 2010 at 6:55 AM, Namit Jain wrote:
> Another option would be to create a wrapper script (not using either a UDF
> or a UDTF).
> That script, in any language
Hi,
I am trying to test dynamic-partition insert, but it is not working as
expected. Could you help me figure out how to solve this problem?
I created the source table:
CREATE EXTERNAL TABLE testmove (
a string,
b string
)
PARTITIONED BY (cust string, dt string);
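For reference, a dynamic-partition insert into this table might look like the sketch below. It assumes a hypothetical staging table `testmove_src` that holds the raw data with `cust` and `dt` as ordinary columns; since no partition column is given a static value, dynamic partitioning must be enabled and set to nonstrict mode first:

```sql
-- Minimal sketch, assuming a staging table `testmove_src(a, b, cust, dt)`.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- The dynamic partition columns (cust, dt) must come last in the SELECT,
-- in the same order as they appear in the PARTITIONED BY clause.
INSERT OVERWRITE TABLE testmove PARTITION (cust, dt)
SELECT a, b, cust, dt
FROM testmove_src;
```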
Data has been kept in /u
Another option would be to create a wrapper script (not using either a UDF or a UDTF).
That script, in any language, can emit any number of output rows per input row.
Look at:
http://wiki.apache.org/hadoop/Hive/LanguageManual/Transform
for details
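As a rough illustration of that approach, here is a minimal sketch of a TRANSFORM script in Python. Hive streams each input row to the script's stdin as a tab-separated line, and every line the script writes to stdout becomes one output row, so a single input row can yield several output rows. The column layout (name_id, longitude, latitude), the zoom levels, and the tile arithmetic are all placeholder assumptions, not the poster's actual logic:

```python
#!/usr/bin/env python
import sys


def explode_row(line, zoom_levels=(0, 1, 2)):
    """Return one tab-separated output line per zoom level for one input row.

    Assumes the input line holds (name_id, longitude, latitude),
    tab-separated, as Hive's TRANSFORM would stream it.
    """
    name_id, lon, lat = line.rstrip("\n").split("\t")
    out = []
    for zoom in zoom_levels:
        # Placeholder tile math; the real toTileX/toTileY formulas
        # from the custom MR code would go here instead.
        tile_x = int((float(lon) + 180.0) * (2 ** zoom))
        tile_y = int((90.0 - float(lat)) * (2 ** zoom))
        out.append("\t".join([name_id, str(tile_x), str(tile_y), str(zoom)]))
    return out


if __name__ == "__main__":
    # Hive feeds rows on stdin; each printed line becomes an output row.
    for line in sys.stdin:
        for row in explode_row(line):
            print(row)
```

The script would then be invoked from HiveQL with something like `SELECT TRANSFORM (name_id, longitude, latitude) USING 'script.py' AS (name_id, x, y, zoom) FROM ...` after registering it with `ADD FILE`.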
From: Sonal Goyal [so
Hey Tim,
You have an interesting problem. Have you tried creating a UDTF for your
case, so that you can possibly emit more than one record for each row of
your input?
http://wiki.apache.org/hadoop/Hive/DeveloperGuide/UDTF
Thanks and Regards,
Sonal
Sonal Goyal | Founder and CEO | Nube Technologi
Hi all,
I am porting custom MR code to Hive and have written working UDFs
where I need them. Is there a workaround to having to do this in
Hive:
select * from
(
select name_id, toTileX(longitude,0) as x, toTileY(latitude,0) as
y, 0 as zoom, funct2(longitude, 0) as f2_x, funct2(latitude,0)