Dear Spark team,

I'm using the EC2 script to start up a Spark cluster. If I log in and use the
executor-memory parameter in the submit script, the UI tells me that no
cores are assigned to the job and nothing happens. Without executor-memory
everything works fine... until I hit a "dag-scheduler-event-loop"
java.lang.OutOfMemoryError: Java heap space, but that's another issue.

./bin/spark-submit \
  --class ... \
  --executor-memory 20G \
  /path/to/examples.jar 
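
For reference, the config-file equivalent of that flag is the
spark.executor.memory property in conf/spark-defaults.conf; a minimal
sketch (the 20g simply mirrors the command-line value above):

# conf/spark-defaults.conf -- equivalent of passing --executor-memory 20G
# (sketch only; "20g" mirrors the value requested on the command line)
spark.executor.memory   20g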

Thanks.


