Re: How to change the separator of INSERT OVERWRITE LOCAL DIRECTORY

2013-05-29 Thread Ted Xu
Hi Felix, I believe it is a bug that you cannot change the field separator when sinking data to files. It was fixed in version 0.11.0. See https://issues.apache.org/jira/browse/HIVE-3682. On Wed, May 29, 2013 at 4:37 PM, Felix.徐 wrote: > Hi all, > > I am wondering how to change the fields separator
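For readers hitting the same problem: starting with Hive 0.11.0 (the release HIVE-3682 above landed in), `INSERT OVERWRITE DIRECTORY` accepts a `ROW FORMAT` clause. A minimal sketch, assuming a source table named `src` with columns `key` and `value` (those names are illustrative, not from the thread):

```sql
-- Hive 0.11.0+ only; earlier versions always write the
-- default Ctrl-A (\001) field delimiter.
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/hive_out'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
SELECT key, value
FROM src;
```

On older versions the usual workaround was to post-process the \001-delimited output files, e.g. with `tr '\001' ','`.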

Re: Overwrite by selected data from table itself?

2013-05-29 Thread Felix.徐
I've done it this way many times; there must be some error in your script. You may paste your script here. 2013/5/30 Stephen Sprague > i think it's a clever idea. Can you reproduce this behavior via a simple > example and show it here? I ran a test on hive 0.80 and it worked as you > would expect

RE: Hive - max rows limit (int limit = 2^31). need Help (looks like a bug)

2013-05-29 Thread Gabi Kazav
Thanks for helping. Here is some more data: create table max_sint_rows (s1 string) partitioned by (p1 string) ROW FORMAT DELIMITED LINES TERMINATED BY '\n'; Create table small_table (p1 string) ROW FORMAT DELIMITED LINES TERMINATED BY '\n'; alter table max_sint_rows add partition (p1="

Re: Hive - max rows limit (int limit = 2^31). need Help (looks like a bug)

2013-05-29 Thread John Meagher
What is the data type of the p1 column? I've used Hive with partitions containing far more than 2 billion rows without having any problems like this. On Wed, May 29, 2013 at 2:41 PM, Gabi Kazav wrote: > Hi, > > We are working on hive DB with our Hadoop cluster. > > We are now facing an issue about joining

Re: Combining 2 JSON objects in Hive

2013-05-29 Thread Stephen Sprague
I know of no way to do this purely natively within Hive; however, don't let that stop you. Enter the transform() function. Write your JSON merge using Python, Perl, Ruby, or whatever floats your boat. Don't let the gnarly syntax on this page scare you: https://cwiki.apache.org/confluence/display
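A sketch of the kind of transform() script Stephen is describing, assuming each input row carries two JSON object columns and a shallow merge is wanted (the script name, column layout, and conflict rule are all assumptions, not from the thread). Hive streams rows to the script on stdin as tab-separated columns and reads merged rows back from stdout:

```python
#!/usr/bin/env python
# merge_json.py -- hypothetical Hive transform() script.
# Hive pipes each row to stdin as tab-separated columns; we merge the
# two JSON object columns and print the result. On duplicate keys the
# second object's value wins (a design choice, not a Hive rule).
import json
import sys

def merge_json(a, b):
    """Shallow-merge two JSON object strings; b's keys win on conflict."""
    merged = json.loads(a)
    merged.update(json.loads(b))
    return json.dumps(merged, sort_keys=True)

if __name__ == "__main__":
    for line in sys.stdin:
        col_a, col_b = line.rstrip("\n").split("\t")
        print(merge_json(col_a, col_b))
```

On the Hive side this would be wired up with something like `SELECT TRANSFORM(json_a, json_b) USING 'python merge_json.py' AS merged FROM t;` (column, table, and script names are placeholders).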

Re: Overwrite by selected data from table itself?

2013-05-29 Thread Stephen Sprague
i think it's a clever idea. Can you reproduce this behavior via a simple example and show it here? I ran a test on hive 0.80 and it worked as you would expect. Regards, Stephen.

hisql> select * from junk;
+-----+
| _c0 |
+-----+
| 1   |
+-----+
1 affected
hisql> insert overwrite table junk sel

Hive - max rows limit (int limit = 2^31). need Help (looks like a bug)

2013-05-29 Thread Gabi Kazav
Hi, We are working on a hive DB with our Hadoop cluster. We are now facing an issue about joining a big partition with more than 2^31 rows. When the partition has more than 2147483648 rows (even 2147483649) the output of the join is a single row. When the partition has fewer than 2147483648 rows (even
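The cutoff Gabi reports lines up exactly with the range of a signed 32-bit integer, which is what makes this look like a row counter overflowing somewhere in the join path (an inference from the numbers, not a confirmed diagnosis). A quick sketch of the arithmetic:

```python
# 2^31 is the first value that no longer fits in a signed 32-bit int,
# which matches the observed cutoff exactly.
INT32_MAX = 2**31 - 1          # 2147483647, largest representable value
overflow_point = 2**31         # 2147483648, Gabi's observed cutoff

def as_int32(n):
    """Wrap n into signed 32-bit range, like Java `int` arithmetic does."""
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

print(as_int32(2147483648))  # wraps to -2147483648
print(as_int32(2147483647))  # still fits: 2147483647
```

A negative wrapped count would explain joins silently misbehaving right at the 2147483648-row boundary while smaller partitions work fine.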

RE: Accessing Table Properies from InputFormat

2013-05-29 Thread Peter Marron
Hi, I am a newbie and I don't want to break any layered abstractions. I am in the situation where I want to be able to examine the predicate in the query and if it's a filter that I recognize then I would like to use it to cut down on the number of records processed. In particular I would like to

Overwrite by selected data from table itself?

2013-05-29 Thread Bhathiya Jayasekara
Hi all, I have a scenario where I need to remove certain rows from a Hive table. As far as I understand, Hive doesn't provide that functionality. So, I'm trying to select the inverse of what I want to delete and overwrite the table with that. What do you think of this approach? I tried it but it seems it d
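The approach Bhathiya describes boils down to a single statement. A minimal sketch, assuming a table `t` with an `id` column (both names are illustrative):

```sql
-- Keep everything EXCEPT the rows we want "deleted".
-- Hive materializes the SELECT before overwriting, so reading
-- from the target table itself works here.
INSERT OVERWRITE TABLE t
SELECT *
FROM t
WHERE id NOT IN (1, 2, 3);
```

The inverse predicate in the WHERE clause defines which rows survive; anything it excludes is gone after the overwrite, so it is worth testing on a copy of the table first.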

Re: Hive Map Join Memory Limit

2013-05-29 Thread Jaideep Dhok
Peter, It looks like you are getting the error in the Hive shell? You can control client memory usage by setting HADOOP_HEAPSIZE in conf/hadoop-env.sh. Thanks, Jaideep On Mon, May 27, 2013 at 12:34 AM, Peter Chu wrote: > Hi, I ran into a memory problem while using Map Join. Errors below, how do > I increase
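Jaideep's suggestion amounts to a one-line change in the Hadoop client environment. A sketch, assuming a 2 GB client heap is appropriate for the workload (the value is an example, not a recommendation):

```shell
# conf/hadoop-env.sh -- maximum heap size, in MB, for Hadoop client
# JVMs, including the Hive CLI when launched via the hadoop script.
export HADOOP_HEAPSIZE=2048
```

This governs only the client-side JVM; map-side join memory on the task side is tuned separately through the job configuration.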

How to change the separator of INSERT OVERWRITE LOCAL DIRECTORY

2013-05-29 Thread Felix.徐
Hi all, I am wondering how to change the fields separator of INSERT OVERWRITE LOCAL DIRECTORY. Does anyone have experience doing this? Thanks!