Re: Spark 1.5.2 memory error

2016-02-03 Thread Nirav Patel
…loves the type safety it provides). Not even sure if changing to DataFrame will for sure solve the issue.

> On Wed, Feb 3, 2016 at 1:33 PM, Mohammed Guller wrote:
>
>> Nirav,

Re: Spark 1.5.2 memory error

2016-02-03 Thread Ted Yu
> Sorry to hear about your experience with Spark; however, "sucks" is a very strong word. Many organizations are processing a lot more than 150GB of data with Spark.
>
> Mohammed

Re: Spark 1.5.2 memory error

2016-02-03 Thread Jerry Lam
> …actitioners/dp/1484209656/>
>
> From: Nirav Patel [mailto:npa...@xactlycorp.com]
> Sent: Wednesday, February 3, 2016 11:31 AM
> To: Stefan Panayotov
> Cc: Jim Green; Ted Yu; Jakob Odersky; user@spark.ap…

Re: Spark 1.5.2 memory error

2016-02-03 Thread Nirav Patel
> …ord. Many organizations are processing a lot more than 150GB of data with Spark.
>
> Mohammed
>
> Author: Big Data Analytics with Spark
> <http://www.amazon.com/Big-Data-Analytics-Spark-Practitio…

Re: Spark 1.5.2 memory error

2016-02-03 Thread Rishabh Wadhawan
> …rp.com <mailto:npa...@xactlycorp.com>]
> Sent: Wednesday, February 3, 2016 11:31 AM
> To: Stefan Panayotov
> Cc: Jim Green; Ted Yu; Jakob Odersky; user@spark.apache.org <mailto:user@spark.apache.org>
> Subject: Re: Spark 1.5.2 memory error

Re: Spark 1.5.2 memory error

2016-02-03 Thread Nirav Patel
> Mohammed
>
> Author: Big Data Analytics with Spark
> <http://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/>
>
> From: Nirav Patel [mailto:npa...@xactlycorp.com]
> Sent: Wednesday, February 3, 2016…

Re: Spark 1.5.2 memory error

2016-02-03 Thread Nirav Patel
> …tp://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/>
>
> From: Nirav Patel [mailto:npa...@xactlycorp.com]
> Sent: Wednesday, February 3, 2016 11:31 AM
> To: Stefan Panayotov
> Cc: Jim Green; Ted Yu; Jakob Odersky; user@spark.apac…

RE: Spark 1.5.2 memory error

2016-02-03 Thread Mohammed Guller
…656/>

From: Nirav Patel [mailto:npa...@xactlycorp.com]
Sent: Wednesday, February 3, 2016 11:31 AM
To: Stefan Panayotov
Cc: Jim Green; Ted Yu; Jakob Odersky; user@spark.apache.org
Subject: Re: Spark 1.5.2 memory error

Hi Stefan,

Welcome to the OOM - heap space club. I have been struggling with s…

Re: Spark 1.5.2 memory error

2016-02-03 Thread Rishabh Wadhawan
> …_01_01: 319.8 MB of 1.5 GB physical memory used; 1.7 GB of 3.1 GB virtual memory used
> 2016-02-03 17:33:22,627 INFO nodemanager.NodeStatusUpdaterImpl (NodeStatusUpdaterImpl.java:removeOrTrackCompletedContainersFromContext(529)) - Removed completed containers from NM conte…

Re: Spark 1.5.2 memory error

2016-02-03 Thread Nirav Patel
> …terImpl.java:removeOrTrackCompletedContainersFromContext(529)) - Removed completed containers from NM context: [container_1454509557526_0014_01_93]
>
> I'll appreciate any suggestions.
>
> Thanks,
>
> Stefan Panayotov, PhD
> Home: 610-355-0919
> Cell: 610-517-5586
> email: …

RE: Spark 1.5.2 memory error

2016-02-03 Thread Stefan Panayotov
…, 2016 4:52 PM
To: Jakob Odersky
Cc: Stefan Panayotov; user@spark.apache.org
Subject: Re: Spark 1.5.2 memory error

What value do you use for spark.yarn.executor.memoryOverhead? Please see https://spark.apache.org/docs/latest/running-on-yarn.html for description of the parameter.

Which Spark rel…

Re: Spark 1.5.2 memory error

2016-02-02 Thread Jim Green
> …the default of 10% of 16g, and Spark version is 1.5.2.
>
> Stefan Panayotov, PhD
> Sent from Outlook Mail for Windows 10 phone
>
> From: Ted Yu
> Sent: Tuesday, February 2, 2016 4:52 PM
> To: Jakob Odersky
> Cc: Stefan Panayotov; user@…

RE: Spark 1.5.2 memory error

2016-02-02 Thread Stefan Panayotov
For the memoryOverhead I have the default of 10% of 16g, and the Spark version is 1.5.2.

Stefan Panayotov, PhD
Sent from Outlook Mail for Windows 10 phone

From: Ted Yu
Sent: Tuesday, February 2, 2016 4:52 PM
To: Jakob Odersky
Cc: Stefan Panayotov; user@spark.apache.org
Subject: Re: Spark 1.5.2…
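[Editor's note: a minimal sketch of what "the default of 10% of 16g" works out to, assuming the Spark 1.x rule of max(10% of executor memory, 384 MB) for spark.yarn.executor.memoryOverhead; not Spark's actual allocation code.]

```python
# Sketch of how Spark 1.x derives the default spark.yarn.executor.memoryOverhead:
# max(10% of executor memory, 384 MB). All values in MB.
MEMORY_OVERHEAD_FACTOR = 0.10  # default fraction in Spark 1.x
MEMORY_OVERHEAD_MIN = 384      # floor in MB

def default_overhead_mb(executor_memory_mb):
    """Off-heap headroom YARN reserves on top of the executor heap."""
    return max(int(executor_memory_mb * MEMORY_OVERHEAD_FACTOR), MEMORY_OVERHEAD_MIN)

# Stefan's setup: 16 GB executors -> about 1.6 GB of headroom.
print(default_overhead_mb(16 * 1024))  # 1638
```

So the container YARN requests is roughly 16 GB heap + ~1.6 GB overhead; anything off-heap beyond that headroom gets the container killed.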

Re: Spark 1.5.2 memory error

2016-02-02 Thread Ted Yu
What value do you use for spark.yarn.executor.memoryOverhead? Please see https://spark.apache.org/docs/latest/running-on-yarn.html for a description of the parameter.

Which Spark release are you using?

Cheers

On Tue, Feb 2, 2016 at 1:38 PM, Jakob Odersky wrote:

> Can you share some code that…
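[Editor's note: for readers finding this thread later, the parameter Ted mentions is set per job. A minimal, hypothetical submit command follows; the class and jar names are placeholders, and 2048 MB is an example value, not a recommendation.]

```shell
# Raise the off-heap headroom (in MB) YARN reserves per executor container.
# Relevant when NodeManager logs show containers killed for running beyond
# physical memory limits. Class/jar names below are placeholders.
spark-submit \
  --master yarn \
  --executor-memory 16g \
  --conf spark.yarn.executor.memoryOverhead=2048 \
  --class com.example.MyJob \
  myjob.jar
```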

Re: Spark 1.5.2 memory error

2016-02-02 Thread Jakob Odersky
Can you share some code that produces the error? It is probably not due to Spark itself but rather to the way data is handled in the user code. Does your code call any reduceByKey actions? These are often a source of OOM errors.

On Tue, Feb 2, 2016 at 1:22 PM, Stefan Panayotov wrote:

> Hi Guys,
>
> I need…
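[Editor's note: Jakob's point about reduceByKey can be illustrated without Spark at all. This is a plain-Python sketch of the two aggregation shapes, not Spark's actual shuffle code; the function names are made up for illustration.]

```python
# Why reduceByKey tends to OOM less than groupByKey: values are folded into
# one running result per key as they arrive, so no key ever has to hold all
# of its values in memory at once.

def group_then_reduce(pairs):
    """groupByKey-style: materialize every value per key, then sum."""
    groups = {}
    for k, v in pairs:
        groups.setdefault(k, []).append(v)  # whole value list kept per key
    return {k: sum(vs) for k, vs in groups.items()}

def reduce_by_key(pairs, combine=lambda a, b: a + b):
    """reduceByKey-style: combine incrementally, one running value per key."""
    acc = {}
    for k, v in pairs:
        acc[k] = combine(acc[k], v) if k in acc else v
    return acc

data = [("a", 1), ("b", 2), ("a", 3)]
assert group_then_reduce(data) == reduce_by_key(data) == {"a": 4, "b": 2}
```

In Spark terms, the second shape also lets the map side pre-combine before the shuffle, which is why the list never balloons on any single executor.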