I meant memory assigned to each worker.
Mohammed
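(A rough sketch of the distinction being asked about, assuming the standalone cluster manager, which this thread does not confirm: worker memory is the per-machine pool a worker can hand out, while executor memory is what each application's executors request from that pool. The 12g/1g values below are only illustrative.)

    # conf/spark-env.sh on each worker machine (standalone mode)
    # total memory this worker may offer to executors
    export SPARK_WORKER_MEMORY=12g

    # conf/spark-defaults.conf (or --executor-memory on spark-submit)
    # memory each executor of an application requests from the worker
    spark.executor.memory   1g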
From: ey-chih chow [mailto:eyc...@hotmail.com]
Sent: Thursday, January 29, 2015 5:15 PM
To: Mohammed Guller; user@spark.apache.org
Subject: RE: unknown issue in submitting a spark job
I use the default value, which I think is 512MB. If I change to 1024MB, Spark submit will fail due to not enough memory for the RDD.
Ey-Chih Chow
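(For reference, the usual submit-time memory knobs; a minimal sketch in which the class name, master URL, and jar path are placeholders, not taken from this thread:)

    spark-submit \
      --class com.example.MyJob \
      --master spark://<master-host>:7077 \
      --driver-memory 1g \
      --executor-memory 1g \
      path/to/my-job.jar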
From: moham...@glassbeam.com
To: eyc...@hotmail.com; user@spark.apache.org
Subject: RE: unknown issue in submitting a spark job
Date: Fri, 30 Jan 2015 00
How much memory are you assigning to the Spark executor on the worker node?
Mohammed
From: ey-chih chow [mailto:eyc...@hotmail.com]
Sent: Thursday, January 29, 2015 3:35 PM
To: Mohammed Guller; user@spark.apache.org
Subject: RE: unknown issue in submitting a spark job
The worker node has 15G of memory.
> Subject: RE: unknown issue in submitting a spark job
> Date: Thu, 29 Jan 2015 21:16:13 +
>
> Looks like the application is using a lot more memory than available. Could
> be a bug somewhere in the code or just an underpowered machine. Hard to say
> without looking
From: ey-chih chow [mailto:eyc...@hotmail.com]
Sent: Thursday, January 29, 2015 1:06 AM
To: user@spark.apache.org
Subject: unknown issue in submitting a spark job
Hi,
I submitted a job using spark-submit and got the following exception.
Does anybody know how to fix this? Thanks.
Ey-Chih Chow
Recipient[Actor[akka://sparkDriver/user/BlockManagerMaster#-538003375]] had already been terminated.
    at akka.pattern.AskableActorRef$.ask$extension(AskSupport.scala:134)
    at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:175)
    at org.apache.spar
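(One way to double-check what the driver and executors were actually given is spark-submit's --verbose flag, which prints the parsed arguments and the Spark properties that were loaded; again, the class and jar below are placeholders:)

    spark-submit --verbose \
      --class com.example.MyJob \
      --master spark://<master-host>:7077 \
      path/to/my-job.jar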