IMPORTANT: access to ox3 non-archived hive tables

2013-10-08 Thread Roberto Congiu
Hey guys, on the storage grid there are some tables (we called them 'raw') that actually reside on the main grid and are supposed to be accessed only by support engineers and by people with special needs who have discussed it with the grid team. That is because querying them affects the producti

Re: HCatLoader / HCatStorer use WebHCat?

2013-10-08 Thread Timothy Potter
Ok - jarjar utility to the rescue for now then ... I'll open a feature request in JIRA and am interested in working on this. Cheers, Tim On Tue, Oct 8, 2013 at 5:28 PM, Sushanth Sowmyan wrote: > Currently, they don't. However, I do see that being potentially useful. > On Oct 8, 2013 11:03 AM,

Re: Hive doesn't support special characters in column name

2013-10-08 Thread Edward Capriolo
It is fairly common not to allow variable names to begin with an _. Many frameworks like Thrift and Protobuf have similar rules, and compilers do as well. It is not going to be an easy change to make; spaces are bad for sure. It is probably easier to write some code to map special
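Edward's suggestion to translate special characters rather than patch the grammar can be sketched in a few lines. This is a hypothetical helper, not anything from the thread; the `c_` prefix choice and the function name are made up:

```python
import re

def to_hive_column(name: str) -> str:
    """Map an arbitrary field name to a Hive-safe column name (sketch)."""
    # Replace every character outside [A-Za-z0-9_] with an underscore.
    safe = re.sub(r'[^A-Za-z0-9_]', '_', name)
    # Hive rejects names beginning with an underscore, so prefix those
    # (hypothetical "c_" prefix) to make them legal.
    if not safe or not safe[0].isalpha():
        safe = 'c_' + safe
    return safe
```

Applied once at schema-creation time, this keeps the original field names recoverable in a side mapping while the table itself stays within Hive's identifier rules.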

Re: HCatLoader / HCatStorer use WebHCat?

2013-10-08 Thread Sushanth Sowmyan
Currently, they don't. However, I do see that being potentially useful. On Oct 8, 2013 11:03 AM, "Timothy Potter" wrote: > I use CassandraStorage and HCatalog in my Pig analytics jobs and they are > using different versions of Thrift. I'm curious if the Pig load / store > funcs could use the WebH

Re: Hive doesn't support special characters in column name

2013-10-08 Thread Zhang Xiaoyu
Thanks, Nitin. If I modify the ANTLR file to allow column names to accept those special characters, what side effects could there be? Could it break the query parsing? Johnny On Tue, Oct 8, 2013 at 12:09 PM, Nitin Pawar wrote: > In hive the column names are restricted by alpha-numeric

Re: Capturing HIVE CLI errors

2013-10-08 Thread Stephen Sprague
The thrift libraries throw errors back to the client as one would expect. Looks like we're going to have to see some example code where you're not seeing them. On Tue, Oct 8, 2013 at 2:57 AM, praveenesh kumar wrote: > Hi guys, > > I am trying to write a client code for HIVE CLI via ODBC connecto

RE: Execution failed with exit status: 3

2013-10-08 Thread Martin, Nick
Thanks for the suggestion. We have the user perms configured appropriately, as we're able to run other Hive queries that kick off MR jobs and complete them successfully. We're on MR1. HWX suggested it might have something to do with hive.auto.convert.join=true and hive.mapjoin.smalltable.filesize=25MB
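If the automatic map-join conversion is indeed the trigger, the usual session-level workaround is to turn the conversion off, or to lower the small-table threshold so fewer joins qualify. A sketch, assuming the local map-join task is what exits with status 3 (both properties are real Hive settings; the threshold value shown is illustrative):

```sql
-- Disable automatic conversion of common joins to map joins...
SET hive.auto.convert.join=false;
-- ...or keep it, but shrink the small-table threshold (bytes; ~25 MB was the default)
SET hive.mapjoin.smalltable.filesize=10000000;
```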

Re: Hive doesn't support special characters in column name

2013-10-08 Thread Nitin Pawar
In Hive, column names are restricted to alphanumeric characters and _ (and may not begin with _). Why was it done? To my understanding, to avoid unnecessary complexity in query parsing or grammar building. The Hive devs will have a definite answer :) On Tue, Oct 8, 2013 at 11:56 PM, Zhang Xiaoyu wrote:

Re: Execution failed with exit status: 3

2013-10-08 Thread Sanjay Subramanian
Hi, are you running this through Beeswax in Hue? If I recall right, you might need to give the "hue" user access to submit and run MR jobs on the cluster. Also, are you using YARN or MR1? Thanks, Regards, Sanjay From: Martin, Nick mailto:nimar...@pssd.com>> Reply-To: "user@hive.apache.org

Re: JSON format files versus AVRO

2013-10-08 Thread Sanjay Subramanian
Hi, thanks. I still have to check out the JsonSerDe in hcatalog. You are right, and I did think about adding the unique key as an attribute inside the JSON. Instead of analyzing further, I am going to try both methods out and see how my downstream processes work. I have a 40-step Oozie workflow that

Re: JSON format files versus AVRO

2013-10-08 Thread Sushanth Sowmyan
Have you had a look at the JsonSerDe in hcatalog to see if it suits your need? It does not support the format you are suggesting directly, but if you made the unique id part of the JSON object, so that each line was a JSON record, it would. It's made to be used in conjunction with text tables. A
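Sushanth's suggestion — move the external unique key inside each JSON object so every line is a self-contained record — can be sketched like this. The field name `doc_id` and the function are hypothetical, made up for illustration:

```python
import json

def to_json_lines(records):
    """Turn (key, dict) pairs into line-delimited JSON with the key embedded."""
    lines = []
    for doc_id, obj in records:
        row = dict(obj)          # copy so the caller's dict is untouched
        row["doc_id"] = doc_id   # embed the external key inside the record
        lines.append(json.dumps(row, sort_keys=True))
    return "\n".join(lines)
```

The resulting one-record-per-line file is the shape a JSON SerDe over a plain text table expects, with no out-of-band key to reconcile.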

Hive doesn't support special characters in column name

2013-10-08 Thread Zhang Xiaoyu
Hi, the columns in my Hive table need to include some special characters like / # - However, I found Hive doesn't allow them, and doesn't allow using \ as an escape character. All the queries below fail: (1) create table test ( "user\/hive" String); create table test ( "user\\/hive" String); create table
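A note for readers landing on this thread later: this discussion predates it, but Hive 0.13 added support for quoted identifiers, which lets most special characters appear in column names when the name is wrapped in backticks (not double quotes) and the session is configured accordingly:

```sql
-- Hive 0.13+ only: enable quoted identifiers for column names
SET hive.support.quoted.identifiers=column;
CREATE TABLE test (`user/hive` STRING);
```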

HCatLoader / HCatStorer use WebHCat?

2013-10-08 Thread Timothy Potter
I use CassandraStorage and HCatalog in my Pig analytics jobs and they are using different versions of Thrift. I'm curious if the Pig load / store funcs could use the WebHCat REST interface instead of Thrift? Thanks. Tim

RE: Execution failed with exit status: 3

2013-10-08 Thread Martin, Nick
Update on this: when I run this in the Hive CLI it works perfectly, so the bug only appears in Hue. I'll send this thread over to hue_user@ and see what they say. From: Martin, Nick [mailto:nimar...@pssd.com] Sent: Tuesday, October 08, 2013 12:17 PM To: user@hive.apache.org Subject: RE: Execution failed wi

Re: Execution failed with exit status: 3

2013-10-08 Thread Nitin Pawar
Hi Martin, are these tables created with any SerDe? On Tue, Oct 8, 2013 at 9:47 PM, Martin, Nick wrote: > Hi Sanjay, thanks for the suggestion. > > There are no partitions on either table. > > *From:* Sanjay Subramanian [mailto:sanjay.subraman...@wizecommerce.com]

RE: Execution failed with exit status: 3

2013-10-08 Thread Martin, Nick
Hi Sanjay, thanks for the suggestion. There are no partitions on either table. From: Sanjay Subramanian [mailto:sanjay.subraman...@wizecommerce.com] Sent: Monday, October 07, 2013 8:19 PM To: user@hive.apache.org Subject: Re: Execution failed with exit status: 3 Hi Nick How many partitions are

Capturing HIVE CLI errors

2013-10-08 Thread praveenesh kumar
Hi guys, I am trying to write client code for the Hive CLI via an ODBC connector, and I want to add query validation on my client side. I was wondering whether there is a way to capture Hive query syntax errors that I can use to validate at my client end. I don't want to write my own validation codes
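One common trick (not proposed in the thread itself) is to ask Hive to EXPLAIN the statement: EXPLAIN makes Hive parse and plan the query without executing it, so a non-zero exit status flags a syntax or semantic error. A minimal sketch, assuming the `hive` CLI is on the PATH; the function names are made up:

```python
import subprocess

def build_explain_command(query: str) -> list:
    # EXPLAIN forces Hive to parse and plan the query without running it.
    return ["hive", "-e", "EXPLAIN " + query]

def validate(query: str) -> bool:
    # True when Hive accepted the query; assumes `hive` is on the PATH.
    result = subprocess.run(build_explain_command(query),
                            capture_output=True, text=True)
    return result.returncode == 0
```

This only checks that the query compiles against the current metastore; it says nothing about runtime failures, and each call pays the CLI's startup cost.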

Re: Wikipedia Dump Analysis..

2013-10-08 Thread Ajeet S Raina
I am not restricted to finding contributor location. That was just one thought which came to my mind. I would like to know what analysis could be done with Wikipedia. The Wikipedia data is an XML dump which is loaded into HDFS, and Hive created two columns for it. On 8 Oct 2013 13:57, "Sonal
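For a sense of what contributor data the dump actually carries: each `<revision>` element contains a `<contributor>` block with a `<username>` (registered editors) or an `<ip>` (anonymous edits). A minimal sketch over a made-up fragment — real dumps use an XML namespace and are far larger, so treat this only as a shape illustration:

```python
import xml.etree.ElementTree as ET

# Tiny made-up fragment in the shape of a Wikipedia export <revision> block.
SAMPLE = """
<revision>
  <timestamp>2013-10-08T12:00:00Z</timestamp>
  <contributor>
    <username>ExampleUser</username>
    <id>12345</id>
  </contributor>
</revision>
"""

def contributor_name(revision_xml: str) -> str:
    # Registered editors carry <username>; anonymous edits carry <ip> instead.
    root = ET.fromstring(revision_xml)
    node = root.find("contributor/username")
    if node is None:
        node = root.find("contributor/ip")
    return node.text if node is not None else ""
```

Extracting fields like this into flat columns before (or inside) Hive is what makes per-contributor or per-timestamp aggregation queries straightforward.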

Re: Wikipedia Dump Analysis..

2013-10-08 Thread Sonal Goyal
Hi Ajeet, unfortunately many of us are not familiar with the Wikipedia format, i.e. where the contributor information is coming from. If you could highlight that and let us know where you are stuck with Hive, we could throw out some ideas. Sonal Best Regards, Sonal Nube Technologies