Re: Size exceeds Integer.MAX_VALUE in BlockFetcherIterator

2014-09-17 Thread Nicholas Chammas
Which appears in turn to be caused by SPARK-1476.

On Wed, Sep 17, 2014 at 9:14 PM, francisco wrote:
> Looks like this is a known issue:
>
> https://issues.apache.org/jira/browse/SPARK-1353

Re: Size exceeds Integer.MAX_VALUE in BlockFetcherIterator

2014-09-17 Thread francisco
Looks like this is a known issue:

https://issues.apache.org/jira/browse/SPARK-1353

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Size-exceeds-Integer-MAX-VALUE-in-BlockFetcherIterator-tp14483p14500.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Size exceeds Integer.MAX_VALUE in BlockFetcherIterator

2014-09-17 Thread Burak Yavuz
probably will not work, giving the "exceeds Integer.MAX_VALUE" error.

Best,
Burak

----- Original Message -----
From: "francisco"
To: u...@spark.incubator.apache.org
Sent: Wednesday, September 17, 2014 3:18:29 PM
Subject: Size exceeds Integer.MAX_VALUE in BlockFetcherIterator

Size exceeds Integer.MAX_VALUE in BlockFetcherIterator

2014-09-17 Thread francisco
Hi,

We are running an aggregation on a huge data set (a few billion rows). While running the task we got the following error (see below). Any ideas? Running Spark 1.1.0 on the CDH distribution.

...
14/09/17 13:33:30 INFO Executor: Finished task 0.0 in stage 1.0 (TID 0). 2083 bytes result sent to driver
14/09
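For context on the error above: the "Size exceeds Integer.MAX_VALUE" failure happens because a single block (e.g. one partition's shuffle data) is memory-mapped into a structure bounded by the JVM's array/ByteBuffer limit of Integer.MAX_VALUE bytes (just under 2 GB). A commonly suggested workaround for this class of error is to raise the partition count so no single block crosses that limit. The sketch below is a hypothetical back-of-the-envelope calculation, not code from this thread; the 500 GB data-size figure and the `PartitionEstimate`/`minPartitions` names are illustrative assumptions.

```java
// Hypothetical sketch (not from the thread): estimate the minimum number of
// partitions needed so that each partition's block stays under
// Integer.MAX_VALUE bytes, the JVM limit behind this error.
public class PartitionEstimate {
    // Hard ceiling on a single block: Integer.MAX_VALUE bytes (~2 GB).
    static final long MAX_BLOCK_BYTES = Integer.MAX_VALUE;

    // Ceiling division: smallest partition count that keeps every
    // (assumed evenly sized) block at or under the limit.
    static long minPartitions(long totalBytes) {
        return (totalBytes + MAX_BLOCK_BYTES - 1) / MAX_BLOCK_BYTES;
    }

    public static void main(String[] args) {
        // Assumed example: ~500 GB of shuffle data for the aggregation.
        long totalBytes = 500L * 1024 * 1024 * 1024;
        System.out.println(minPartitions(totalBytes)); // prints 251
    }
}
```

In Spark terms this would mean calling `repartition(n)` on the RDD (or raising `spark.default.parallelism`) with `n` at least this large; in practice you would use a value several times higher, since partitions are rarely evenly sized.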