The CDH3b3 Hive release includes patches to Driver.java to not use
UnixUserGroupInformation and instead make calls to a shim layer that works
with the authentication system in Hadoop.
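In practice a shim call like that typically just asks the newer Hadoop security API for the current user instead of touching UnixUserGroupInformation directly. A rough sketch of the idea (the class and method names below are illustrative, not the actual CDH3b3 patch):

import java.io.IOException;
import org.apache.hadoop.security.UserGroupInformation;

// Illustrative shim: resolve the current user through the secure-Hadoop
// UserGroupInformation API instead of the removed UnixUserGroupInformation.
// Names here are a sketch, not the exact CDH3b3 patch.
public class HadoopAuthShim {

  /** Short name of the user running the Hive session. */
  public static String getCurrentUserName() throws IOException {
    UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
    return ugi.getShortUserName();
  }
}

Driver.java would then go through a call like HadoopAuthShim.getCurrentUserName() rather than importing UnixUserGroupInformation, so the same Hive build can work against both the old and the new Hadoop security code.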
Are there features in hive trunk/0.6 that you are interested in using but
are not available in cdh3b3 Hive?
-Vinithra
On Mon, Nov 8, 2010, Jinsong Hu wrote:
Hi, There:
I was trying to compile the trunk version of Hive with the Cloudera cdh3b3
distribution and noticed that in org.apache.hadoop.hive.ql, Driver.java was
using org.apache.hadoop.security.UnixUserGroupInformation.
However, in the cdh3b3 version of Hadoop this class is gone, so Driver.java
does not compile. Is there an elegant way to handle this?
Jimmy.
--
From: "Ted Yu"
Sent: Tuesday, October 12, 2010 4:33 PM
To:
Subject: Re: blob handling in hive
How about UTF-8 encoding your blob and storing it in Hive as a String?
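One caveat with that route: arbitrary binary is not always valid UTF-8, so a reversible text encoding such as Base64 is a common way to get a safe round trip into a STRING column. A rough sketch in plain Java (class and method names are only for illustration):

import javax.xml.bind.DatatypeConverter;

// Round-trip a binary blob through a Hive-friendly STRING value.
// Base64 is used because arbitrary bytes are not always valid UTF-8.
// javax.xml.bind ships with Java 6/7/8; java.util.Base64 is the newer equivalent.
public class BlobAsString {

  public static String encode(byte[] blob) {
    return DatatypeConverter.printBase64Binary(blob);
  }

  public static byte[] decode(String column) {
    return DatatypeConverter.parseBase64Binary(column);
  }

  public static void main(String[] args) {
    byte[] blob = new byte[] { 0x00, (byte) 0xff, 0x10, 0x42 };  // sample binary payload
    String cell = encode(blob);   // value to store in the STRING column
    byte[] back = decode(cell);   // original bytes recovered at read time
    System.out.println(cell + " -> " + back.length + " bytes");
  }
}

The price is roughly a 4/3 increase in size for the encoded data.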
On Tue, Oct 12, 2010 at 4:20 PM, Jinsong Hu wrote:
I tho...
How about creating org.apache.hadoop.hive.serde2.io.BytesWritable which wraps byte[]?
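For reference, a minimal sketch of what such a wrapper could look like: a Writable that just carries a byte[] (not an existing Hive class, only an illustration of the idea):

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.Writable;

// Minimal byte[] wrapper in the spirit of the suggestion above;
// not an existing Hive class, just an illustration.
public class HiveBytesWritable implements Writable {
  private byte[] bytes = new byte[0];

  public void set(byte[] b) { bytes = b; }
  public byte[] get() { return bytes; }

  public void write(DataOutput out) throws IOException {
    out.writeInt(bytes.length);   // length prefix, then the raw payload
    out.write(bytes);
  }

  public void readFields(DataInput in) throws IOException {
    bytes = new byte[in.readInt()];
    in.readFully(bytes);
  }
}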
On Tue, Oct 12, 2010 at 3:49 PM, Jinsong Hu wrote:
Storing the blob in HBase is too costly; HBase compaction costs lots of CPU.
All I want to do is to be able to read the byte array out of a sequence
file, and map that byte array ...
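For reference, pulling the raw bytes back out of a sequence file looks roughly like this with SequenceFile.Reader (the path and the key type below are assumptions made for the sketch):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;

// Iterate a sequence file of (LongWritable, BytesWritable) records and pull
// out each value's byte[]. The path and the key type are assumptions here.
public class ReadBlobsFromSequenceFile {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    Path path = new Path("/data/blobs.seq");   // hypothetical file
    SequenceFile.Reader reader = new SequenceFile.Reader(fs, path, conf);
    LongWritable key = new LongWritable();
    BytesWritable value = new BytesWritable();
    while (reader.next(key, value)) {
      // getBytes() returns the backing buffer; only the first getLength() bytes are valid.
      byte[] blob = new byte[value.getLength()];
      System.arraycopy(value.getBytes(), 0, blob, 0, value.getLength());
      // ... hand the blob to whatever does the mapping ...
    }
    reader.close();
  }
}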
Sent: Tuesday, October 12, 2010
To:
Subject: Re: blob handling in hive
One way is to store the blob in HBase and use HBaseHandler to access your blob.
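For that route, writing the blob into an HBase cell from Java looks roughly like this (CDH3-era client API; the table and column names are made up), and the Hive side would then map the same table through the HBase storage handler:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

// Write one blob into HBase. The "blobs" table and "b:data" column are
// made-up names used only for this sketch.
public class BlobToHBase {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    HTable table = new HTable(conf, "blobs");
    byte[] blob = new byte[] { 0x00, (byte) 0xca, (byte) 0xfe, 0x42 };  // binary payload from MySQL
    Put put = new Put(Bytes.toBytes("row-001"));
    put.add(Bytes.toBytes("b"), Bytes.toBytes("data"), blob);
    table.put(put);
    table.close();
  }
}

On the Hive side the column family and qualifier used above would be listed in the hbase.columns.mapping property of a table created with the HBase storage handler.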
On Tue, Oct 12, 2010 at 2:14 PM, Jinsong Hu wrote:
Hi,
I am using Sqoop to export data from MySQL to Hive. I noticed that Hive
doesn't have a blob data type yet. Is there any way I can make Hive store
blobs?
Hi,
I am using Sqoop to export data from MySQL to Hive. I noticed that Hive
doesn't have a blob data type yet. Is there any way I can make Hive store
blobs?
Jimmy