RCFile is not working with BZip2. Interested in using LZO in general.

2011-03-02 Thread phil young
I'm wondering if my configuration/stack is wrong, or if I'm trying to do something that is not supported in Hive. My goal is to choose a compression scheme for Hadoop/Hive, and while comparing configurations, I'm finding that I can't get BZip2 or gzip to work with the RCFile format. Is that supporte
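For context, the usual way to pair RCFile with a compression codec is a sketch like the following (assuming a stock Hadoop/Hive install; the codec class names are the standard Hadoop ones, and the table names are illustrative, not from the thread):

```sql
-- Store the table in RCFile format.
CREATE TABLE logs_rc (line STRING)
STORED AS RCFILE;

-- Ask Hive to compress query output and choose a codec;
-- BZip2Codec and GzipCodec ship with stock Hadoop.
SET hive.exec.compress.output=true;
SET mapred.output.compression.codec=org.apache.hadoop.io.compress.BZip2Codec;

-- Populate the RCFile table; the codec above is applied to its blocks.
INSERT OVERWRITE TABLE logs_rc
SELECT line FROM logs_text;
```

If a codec fails here but works with plain text tables, the mismatch is more likely in the codec/input-format pairing than in the table DDL itself.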

Re: Trouble using mysql metastore

2011-03-02 Thread Viral Bajaria
This definitely looks like a CLASSPATH error. Where did you get the mysql.jar from? Can you open it up and make sure that it includes the com.mysql.jdbc.Driver namespace? I am guessing the mysql.jar is not the one that you need. You can download a new one from the MySQL website. To be clear, I

Re: Trouble using mysql metastore

2011-03-02 Thread Ajo Fod
Hi Bennie, Thanks for the response! I had CLASSPATH set to include /usr/share/java/mysql.jar ... in addition, I just copied the mysql.jar to the lib directory of Hive. I still get the same bug. Any other ideas? Thanks, -Ajo On Wed, Mar 2, 2011 at 7:01 AM, Bennie Schut wrote: > Usually t

Re: Associative Arrays in Hive?

2011-03-02 Thread 김영우
Refer to this http://dev.bizo.com/2011/02/columns-in-hive.html HTH - Youngwoo 2011/3/2 Sunderlin, Mark > Let us say my log data that I want to place a log file into hive. And > the log file itself looks something like this: > > > > Event_time

Re: Associative Arrays in Hive?

2011-03-02 Thread Edward Capriolo
On Wed, Mar 2, 2011 at 9:27 AM, Sunderlin, Mark wrote: > Let us say my log data that I want to place a log file into hive. And > the log file itself looks something like this: > > > > Event_time, event_type, event_data_blob > > > > And the blob data looks like > > “Key1=value1;key2=value2;key3=v

Re: Trouble using mysql metastore

2011-03-02 Thread Bennie Schut
Usually this is caused by not having the MySQL JDBC driver on the classpath (it's not included with Hive by default). Just put the MySQL JDBC driver jar in the Hive folder under "lib/". On 03/02/2011 03:15 PM, Ajo Fod wrote: I've checked the mysql connection with a separate java file with the same string
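For reference, a minimal sketch of the MySQL metastore wiring this thread is describing (the property names are the standard javax.jdo ones; the host, database, user, and password values below are placeholders, not from the thread):

```xml
<!-- hive-site.xml: point the metastore at MySQL.
     The connector jar (e.g. a mysql-connector-java jar) must
     also be copied into $HIVE_HOME/lib/, as noted above. -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://dbhost/hive_metastore?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepass</value>
</property>
```

A ClassNotFoundException for com.mysql.jdbc.Driver with this config in place almost always means the jar in lib/ is missing or is not the real Connector/J jar.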

Re: cannot start the transform script. reason : "argument list too long"

2011-03-02 Thread Dave Brondsema
We've gotten this error a couple of times too; the message is very misleading and does not describe the real problem. IIRC, I determined that the root cause is selecting too many input files (even though those do NOT get passed as arguments to the transform script). For example, this happened once we had a lot of dynamic partitions
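If too many small input files are the trigger, one mitigation is to reduce the file and split count on both sides of the query. A sketch, assuming these settings are available in the Hive build in use:

```sql
-- Combine many small input files into fewer splits, so a single
-- task is handed fewer files.
SET hive.input.format=org.apache.hadoop.hive.ql.io.CombineHiveInputFormat;

-- Merge small output files when writing (e.g. with dynamic
-- partitions), which keeps the file count down for later queries.
SET hive.merge.mapfiles=true;
SET hive.merge.mapredfiles=true;
```

This does not change the transform script itself; it only shrinks the number of input files the job has to track.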

Associative Arrays in Hive?

2011-03-02 Thread Sunderlin, Mark
Let us say I have log data that I want to place into Hive, and the log file itself looks something like this: Event_time, event_type, event_data_blob And the blob data looks like "Key1=value1;key2=value2;key3=value3 ... keyn=valuen" This looks like maybe I start like this: Create table
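One way to model a key=value blob like this as a native Hive map, using the standard delimited-row DDL (a sketch; the table and column names are illustrative, not from the thread):

```sql
-- Field 3 is parsed into a MAP<STRING,STRING> by telling the
-- serde which characters separate pairs and keys from values.
CREATE TABLE events (
  event_time      STRING,
  event_type      STRING,
  event_data_blob MAP<STRING, STRING>
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY ','
  COLLECTION ITEMS TERMINATED BY ';'
  MAP KEYS TERMINATED BY '=';

-- Individual keys can then be addressed directly:
SELECT event_data_blob['key1'] FROM events;
```

This avoids any per-key columns: new keys in the blob simply show up as new map entries.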