Re: hive error: "Too many bytes before delimiter: 2147483648"

2019-12-01 Thread Shawn Weeks
That looks like you’ve encountered a file with no delimiter, as that’s near the maximum size for an array or string. Also, I don’t think you can terminate fields with a line feed, as that’s the hard-coded row delimiter. Thanks, Shawn. From: xuanhuang <18351886...@163.com> Reply-To: "user@hive.apache.org
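For context, here is the relevant DDL as a minimal sketch (table and column names are placeholders): Hive lets you choose the field terminator, but rows are always split on the line feed, which is why '\n' cannot double as a field delimiter.

  create table example (c1 string, c2 string)
  row format delimited
    fields terminated by '\t'   -- any single character except '\n'
    lines terminated by '\n';   -- Hive only accepts the newline here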

RE: Hive error : Can not convert struct<> to

2016-06-28 Thread Markovitz, Dudu
Cannot convert column 4 from struct<…> to struct<…>. -----Original Message----- From: Gopal Vijayaraghavan [mailto:go...@hortonworks.com] On Behalf Of Gopal Vijayaraghavan Sent: Tuesday, June 28, 2016 6:17 PM To: user@hive.apache.org Subject: Re: Hive error : Can not convert struct<> to

Re: Hive error : Can not convert struct<> to

2016-06-28 Thread Gopal Vijayaraghavan
> PARTITION(state='CA')
> SELECT * WHERE se.adr.st='CA'
> FAILED: SemanticException [Error 10044]: Line 2:23 Cannot insert into target table because column number/types are different ''CA''

The error is bogus, but the issue has to do with the "SELECT *". Inserts where a partition is specified
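For context, the usual fix is to name the non-partition columns explicitly instead of using SELECT *, since the partition column must not appear in the select list. A minimal sketch (the source table and column list are assumptions, not from the thread):

  insert into table target partition (state='CA')
  select id, se              -- every non-partition column, listed explicitly
  from source
  where se.adr.st = 'CA';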

RE: Hive error : Can not convert struct<> to

2016-06-28 Thread Markovitz, Dudu
Hi. The fields' names are part of the struct definition: different names, different types of structs. Dudu. E.g., setup: create table t1 (s struct<c1:int,c2:int>); create table t2 (s struct<…>); insert into table t1 select named_struct('c1',1,'c2',2);
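Filling in the struct types that were stripped from the excerpt, the setup and the failure it demonstrates would look roughly like this (the t2 field names and the final statement are assumptions, not quoted from the thread):

  create table t1 (s struct<c1:int, c2:int>);
  create table t2 (s struct<c3:int, c4:int>);
  insert into table t1 select named_struct('c1', 1, 'c2', 2);
  insert into table t2 select s from t1;  -- fails: cannot convert struct<c1:int,c2:int> to struct<c3:int,c4:int>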

Re: hive error when trying to write data to s3n

2015-08-04 Thread Pun Intended
Yes, the explain plan definitely only has Move Operators (no Copy Operators). With that, though, this definitely looks like a Hive bug. Does anyone know if there is a corresponding HIVE ticket or a workaround for the issue? Thanks!

  Stage: Stage-3
    Move Operator
      files: hdfs direct

Re: hive error when trying to write data to s3n

2015-08-04 Thread Gopal Vijayaraghavan
> Moving data to: s3n://:@my_bucket/a/b/2015-07-30/.hive-staging_hive_2015-08-04_18-38-47_649_1476668515119011800-1/-ext-1
> Failed with exception Wrong FS: s3n://:@my_bucket/a/b/2015-07-30/.hive-staging_hive_2015-08-04_18-38-47_649_1476668515119011800-1/-ext-10002, expected: hdfs://s
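No JIRA ticket is cited in the thread excerpts. A common workaround, sketched here with an illustrative path and table name, is to write the result to HDFS first so the final Move stays within one filesystem, then copy it to S3 outside Hive:

  insert overwrite directory '/tmp/export/2015-07-30'
  select * from source_table;
  -- then, e.g.: hadoop distcp /tmp/export/2015-07-30 s3n://my_bucket/a/b/2015-07-30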

Re: Hive Error

2013-05-20 Thread Sanjay Subramanian
Hi Varun, can you attach the error logs? I don't seem to have the attachment. Thanks, Sanjay. From: Kasa V Varun <kasa.va...@mu-sigma.com> Reply-To: "user@hive.apache.org" <user@hive.apache.org> Date: Sunday, May 19, 2013 11:16 PM To: "user@hive.apache.org

Re: Hive error: Unable to deserialize reduce input key

2012-09-06 Thread Navis류승우
I've tried to deserialize your data.
0 = bigint = -6341068275337623706
1 = string = TTFVUFHFH
2 = int = -1037822201
3 = int = -1467607277
4 = int = -1473682089
5 = int = -1337884091
6 = string = I
7 = string = IVH ISH
8 = int = -1321908327
9 = int = -1475321453
10 = int = -1476394752
11 = string =

Re: Hive error: Unable to deserialize reduce input key

2012-09-06 Thread praveenesh kumar
I am not sure what the issue can be... I had it long back and got no response. I tried these things:
1. Increased the child JVM heap size.
2. Reduced the number of reducers for the job.
3. Checked whether the disks were getting full while running the query.
4. Checked my data again.
I think many
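Item 2 above, for reference, corresponds to a per-session setting in Hive; a sketch with an illustrative value, using the classic MapReduce property name:

  set mapred.reduce.tasks=8;  -- cap the number of reducers for subsequent queries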

Re: Hive error: Unable to deserialize reduce input key

2012-09-06 Thread 曹坤
Hi praveenesh kumar: I am getting the same error today. Do you have any solution? 2012/3/23 praveenesh kumar:
> Hi all,
> I am getting the following error when I am trying to do a select ... with group by operation. I am grouping on around 25 columns.
> java.lang.RuntimeException: org.ap

Re: HIVE ERROR

2012-07-25 Thread Bejoy Ks
Hi. It is because of space issues. Issue the 'df -h' command on the TaskTracker node that reported this error; the partition used for dfs.data.dir is probably full. Regards, Bejoy KS. From: abhiTowson cal To: user@hive.apache.org Sent: Wednesday, July 25, 2012 9:48 PM Subjec

Re: Hive error when running a count query.

2012-05-10 Thread shashwat shriparv
Whenever you execute any query other than "select * from tablename", Hive runs a MapReduce job in the background, for which it needs Hadoop to be properly configured and proper communication between Hadoop and Hive. The error you specified happens when Hive is not able to connect to Hadoop properly. Here is t
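To make the distinction concrete (the table name is a placeholder):

  select * from tablename;         -- plain fetch, no MapReduce job needed
  select count(*) from tablename;  -- compiles to a MapReduce job, so Hive must reach Hadoop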

Re: Hive Error on medium sized dataset

2011-01-27 Thread hadoop n00b
> January 27, 2011 11:21 AM
> To: user@hive.apache.org
> Subject: RE: Hive Error on medium sized dataset
>
> I removed the part of the SerDe that handled the arbitrary key/value pairs and I was able to process my entire data set. Sadly the part I removed has all the int

RE: Hive Error on medium sized dataset

2011-01-27 Thread Christopher, Pat
Subject: RE: Hive Error on medium sized dataset

I removed the part of the SerDe that handled the arbitrary key/value pairs and I was able to process my entire data set. Sadly the part I removed has all the interesting data. I'll play more with the heap settings and see if that lets me proces

RE: Hive Error on medium sized dataset

2011-01-27 Thread Christopher, Pat
correct way to set the child heap value? Thanks, Pat. From: Christopher, Pat Sent: Thursday, January 27, 2011 10:27 AM To: user@hive.apache.org Subject: RE: Hive Error on medium sized dataset It will be tricky to clean up the data format as I'm operating on somewhat arbitrary key-value pairs

RE: Hive Error on medium sized dataset

2011-01-27 Thread Christopher, Pat
my mapred-site.xml:

  <property>
    <name>mapred.child.java.opts</name>
    <value>-Xm512M</value>
  </property>

Is that how I'm supposed to do that? Thanks, Pat. From: hadoop n00b [mailto:new2h...@gmail.com] Sent: Wednesday, January 26, 2011 9:09 PM To: user@hive.apache.org Subject: Re: Hive Error on medium sized dataset We typically get t
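One note in passing: the JVM's maximum-heap flag is -Xmx, so the value above would need to read -Xmx512M. The same property can also be set per Hive session instead of editing mapred-site.xml; a sketch with an illustrative size:

  set mapred.child.java.opts=-Xmx512M;  -- child JVM heap for jobs launched from this session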

Re: Hive Error on medium sized dataset

2011-01-26 Thread hadoop n00b
We typically get this error while running complex queries on our 4-node setup when the child JVM runs out of heap size. I would be interested in what the experts have to say about this error. On Thu, Jan 27, 2011 at 7:27 AM, Ajo Fod wrote:
> Any chance you can convert the data to a tab separated t

Re: Hive Error on medium sized dataset

2011-01-26 Thread Ajo Fod
Any chance you can convert the data to a tab-separated text file and try the same query? It may not be the SerDe, but it may be good to isolate that away as a potential source of the problem. -Ajo. On Wed, Jan 26, 2011 at 5:47 PM, Christopher, Pat <patrick.christop...@hp.com> wrote:
> Hi,
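A minimal sketch of that isolation step (table and column names are placeholders, not from the thread): copy the data into a plain tab-delimited table and rerun the failing query against it, taking the custom SerDe out of the picture.

  create table t_plain (k string, v string)
  row format delimited fields terminated by '\t'
  stored as textfile;

  insert overwrite table t_plain select k, v from t_serde;
  -- if the query now succeeds against t_plain, suspect the SerDe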