RE: unknown issue in submitting a spark job

2015-01-30 Thread Sean Owen
> Spark submit will fail due to not enough memory for rdd. > Ey-Chih Chow

RE: unknown issue in submitting a spark job

2015-01-29 Thread Mohammed Guller
I meant memory assigned to each worker. Mohammed
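In Spark standalone mode, the memory a worker can hand out to executors is configured separately from the per-executor setting. A minimal sketch, assuming a standalone cluster where conf/spark-env.sh is edited on each worker node (the values shown are illustrative, not from this thread):

    # conf/spark-env.sh on each worker node (illustrative values)
    export SPARK_WORKER_MEMORY=12g   # total memory this worker may allocate to executors
    export SPARK_WORKER_CORES=4      # total cores this worker may allocate to executors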

RE: unknown issue in submitting a spark job

2015-01-29 Thread ey-chih chow
I use the default value, which I think is 512MB. If I change it to 1024MB, Spark submit will fail due to not enough memory for rdd. Ey-Chih Chow
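The 512MB figure matches Spark's default executor memory (spark.executor.memory). A minimal sketch of raising it at submission time; the class, master URL, and jar names below are hypothetical placeholders:

    # Hypothetical class, master URL, and jar; --executor-memory overrides the 512m default.
    spark-submit \
      --class com.example.MyJob \
      --master spark://master-host:7077 \
      --executor-memory 2g \
      myjob.jar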

RE: unknown issue in submitting a spark job

2015-01-29 Thread Mohammed Guller
How much memory are you assigning to the Spark executor on the worker node? Mohammed > The worker node has 15G ...

RE: unknown issue in submitting a spark job

2015-01-29 Thread ey-chih chow
> Looks like the application is using a lot more memory than available. Could be a bug somewhere in the code or just an underpowered machine. Hard to say without looking at the code.

RE: unknown issue in submitting a spark job

2015-01-29 Thread Mohammed Guller
Looks like the application is using a lot more memory than available. Could be a bug somewhere in the code or just an underpowered machine. Hard to say without looking at the code.

Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded

Mohammed

Re: unknown issue in submitting a spark job

2015-01-29 Thread Arush Kharbanda
Hi,
There are two ways to resolve the issue, as per http://stackoverflow.com/questions/5839359/java-lang-outofmemoryerror-gc-overhead-limit-exceeded:
1. Increasing the heap size, via "-Xmx1024m" (or more), or
2. Disabling the error check altogether, via "-XX:-UseGCOverheadLimit".
You can pass the Java ...
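A minimal sketch of passing those JVM flags through spark-submit; note that Spark does not accept heap-size flags such as -Xmx inside spark.executor.extraJavaOptions, so the heap is sized with --executor-memory instead. The class and jar names are hypothetical placeholders:

    # Hypothetical class and jar; heap size goes through --executor-memory,
    # and only the GC-overhead flag is passed as an extra Java option.
    spark-submit \
      --class com.example.MyJob \
      --executor-memory 1g \
      --conf "spark.executor.extraJavaOptions=-XX:-UseGCOverheadLimit" \
      myjob.jar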