From: Christopher, Pat
Sent: Thursday, January 27, 2011 11:21 AM
To: user@hive.apache.org
Subject: RE: Hive Error on medium sized dataset
I removed the part of the SerDe that handled the arbitrary key/value pairs and
I was able to process my entire data set. Sadly the part I removed has all the
interesting data.
I'll play more with the heap settings and see if that lets me process the full
data set. What is the correct way to set the child heap value?
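One way to raise the child heap without touching the cluster-wide config is to set the property per session from the Hive CLI. This is a sketch of that approach; the 1024m value and the table name are illustrative, not from the thread:

```sql
-- Sketch: raise the map/reduce child JVM heap for this Hive session only.
-- mapred.child.java.opts is the Hadoop 0.20-era property discussed in this thread.
SET mapred.child.java.opts=-Xmx1024m;
SELECT count(*) FROM my_table;  -- my_table is a placeholder name
```

A session-level SET only affects jobs launched from that CLI session, which makes it handy for testing heap values before committing them to mapred-site.xml.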
Thanks,
Pat
From: Christopher, Pat
Sent: Thursday, January 27, 2011 10:27 AM
To: user@hive.apache.org
Subject: RE: Hive Error on medium sized dataset
It will be tricky to clean up the data format as I'm operating on somewhat
arbitrary key-value pairs.

Here is my mapred-site.xml:

  <property>
    <name>mapred.child.java.opts</name>
    <value>-Xm512M</value>
  </property>
Is that how I'm supposed to do that?
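For reference, the usual form of that entry uses the -Xmx spelling of the JVM max-heap flag (the 512M value here is just an example):

```xml
<!-- Sketch of a mapred-site.xml entry; 512M is an example value -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx512M</value>
</property>
```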
Thanks,
Pat
From: hadoop n00b [mailto:new2h...@gmail.com]
Sent: Wednesday, January 26, 2011 9:09 PM
To: user@hive.apache.org
Subject: Re: Hive Error on medium sized dataset
We typically get this error while running complex queries on our 4-node
setup when the child JVM runs out of heap size. Would be interested in what
the experts have to say about this error.
On Thu, Jan 27, 2011 at 7:27 AM, Ajo Fod wrote:
Any chance you can convert the data to a tab separated text file and try the
same query?
It may not be the SerDe, but it may be good to isolate that away as a
potential source of the problem.
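Ajo's suggestion, dumping the data to a tab-separated file so a plain-text Hive table can be queried without the custom SerDe, can be sketched like this. The "key=value" input format and the column list are assumptions about Pat's data, not the real schema:

```python
# Sketch: flatten "k1=v1 k2=v2 ..." records into fixed TSV columns so the
# data can be loaded into a plain Hive table (no custom SerDe needed).
# The input format and column names below are assumptions, not Pat's schema.

def to_tsv(lines, columns):
    """Yield one tab-separated row per input record."""
    for line in lines:
        pairs = dict(tok.split("=", 1) for tok in line.split() if "=" in tok)
        # Missing keys become \N, Hive's default NULL marker for text tables.
        yield "\t".join(pairs.get(col, r"\N") for col in columns)

records = ["host=web1 status=200 bytes=512", "host=web2 bytes=128"]
rows = list(to_tsv(records, ["host", "status", "bytes"]))
print(rows)  # ['web1\t200\t512', 'web2\t\\N\t128']
```

Once the file is written, a plain `CREATE TABLE ... ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'` table can read it, which isolates whether the SerDe is the source of the failure.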
-Ajo.
On Wed, Jan 26, 2011 at 5:47 PM, Christopher, Pat <patrick.christop...@hp.com> wrote:
> Hi,