Re: [Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread Yong Zhang
Yeah, we also didn't find anything related to this online. Are you aware of any memory leaks in the worker in Spark 1.6.2 which might be causing this? Do you know of any documentation which explains all the tasks t…

Re: [Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread Behroz Sikander
Thank you for the response. Yes, I am sure because the driver was working fine. Only 2 workers…
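To confirm which JVM actually hit the OOM (the standalone worker daemon, as opposed to an executor or the driver), one way is to check the worker's own daemon log. A minimal sketch, assuming a default Spark 1.6.x standalone layout under $SPARK_HOME (the exact log file name varies with the user and host):

    # List the JVMs running on a worker VM; the standalone daemon shows up
    # as "Worker", executors as "CoarseGrainedExecutorBackend"
    jps -l

    # Search the worker daemon's log for the OOM
    grep -n "OutOfMemoryError" $SPARK_HOME/logs/spark-*-org.apache.spark.deploy.worker.Worker-*.out

If the error appears in the Worker-*.out log rather than in an application's executor stderr, it is the daemon itself that ran out of heap.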

Re: [Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread Yong Zhang
Thank you for the response. Yes, I am sure because the driver was working fine. Only 2 workers went down with OOM. Regards, Behroz. On Fri, Mar 24, 2017 at 2:12 PM, Yong Zhang <java8...@hotmail.com> wrote: I am not 100% sur…

Re: [Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread Behroz Sikander
…are you sure your workers OOM? Yong. (Quoting bsikander's original report: Spark version: 1.6.2…)

Re: [Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread Yong Zhang
How can we avoid that in the future? View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Worker-Crashing-OutOfMemoryError-GC-overhead-limit-execeeded-tp28535.html

[Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-24 Thread bsikander
… How can we avoid that in the future? View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Worker-Crashing-OutOfMemoryError-GC-overhead-limit-execeeded-tp28535.html
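One mitigation often suggested for standalone worker daemons whose heap grows over time is to cap how much finished-application state the worker retains for its web UI. A minimal conf/spark-env.sh sketch, assuming Spark 1.6.x standalone mode; the retention values are illustrative, and this is not confirmed as the cause in this thread:

    # conf/spark-env.sh on each worker:
    # cap how many finished executors/drivers the Worker keeps in memory
    # for its UI (both settings default to 1000)
    export SPARK_WORKER_OPTS="$SPARK_WORKER_OPTS \
      -Dspark.worker.ui.retainedExecutors=100 \
      -Dspark.worker.ui.retainedDrivers=100"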

[Worker Crashing] OutOfMemoryError: GC overhead limit exceeded

2017-03-23 Thread Behroz Sikander
Hello,

Spark version: 1.6.2
Hadoop: 2.6.0
Cluster: all VMs are deployed on AWS.
1 Master (t2.large)
1 Secondary Master (t2.large)
5 Workers (m4.xlarge)
Zookeeper (t2.large)

Recently, 2 of our workers went down with an out-of-memory exception:
> java.lang.OutOfMemoryError: GC overhead limit exceeded
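The "GC overhead limit exceeded" variant of OutOfMemoryError means the JVM was spending nearly all of its time in garbage collection while reclaiming almost no heap, so the worker daemon's heap (1 GB by default via SPARK_DAEMON_MEMORY) was effectively exhausted. A minimal sketch of how one could size the daemon and capture evidence on the next crash, assuming Spark 1.6.x standalone mode; the sizes and paths are illustrative:

    # conf/spark-env.sh on each worker:
    # give the master/worker daemons themselves (not executors) a larger heap
    export SPARK_DAEMON_MEMORY=2g

    # on the next OOM, write a heap dump and keep a GC log so the
    # retained objects can be inspected (e.g. with Eclipse MAT)
    export SPARK_DAEMON_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS \
      -XX:+HeapDumpOnOutOfMemoryError \
      -XX:HeapDumpPath=/tmp/spark-worker.hprof \
      -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps"

The heap dump shows whether the retained memory is worker bookkeeping (see the UI retention settings above) or something else.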