If the cluster runs out of memory, it seems that the executor will be restarted by
the cluster manager.
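
For reference, the knobs that govern that restart-and-retry behavior on YARN look roughly like the spark-defaults.conf sketch below (the property names are from the YARN deployment docs of that era; the values are illustrative assumptions, not tuned recommendations):

    # Hypothetical spark-defaults.conf entries; values are illustrative only.
    # Number of attempts per task before the whole job is failed:
    spark.task.maxFailures 4
    # Extra off-heap headroom per executor (MB), on top of --executor-memory;
    # exceeding heap plus overhead is what gets a container killed by YARN:
    spark.yarn.executor.memoryOverhead 2048

When YARN kills a container for exceeding its memory limit, Spark requests a replacement executor and reschedules the lost tasks, so a single executor OOM does not by itself fail the job.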


Jared, (韦煜)
Software developer
Interested in open source software, big data, Linux

________________________________
From: Ascot Moss <ascot.m...@gmail.com>
Sent: Thursday, July 28, 2016 9:48:13 AM
To: user @spark
Subject: A question about Spark Cluster vs Local Mode

Hi,

If I submit the same job to Spark in cluster mode, does that mean it will run in
the cluster's memory pool and fail if it runs out of the cluster's memory?


--driver-memory 64g \
--executor-memory 16g \
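
For context, those two flags would sit in a spark-submit command along the following lines; the master URL, deploy mode, and application JAR below are placeholders, not the actual command:

    # Sketch of a cluster-mode submission; the master, deploy mode,
    # and your-app.jar are assumptions standing in for the real values.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --driver-memory 64g \
      --executor-memory 16g \
      your-app.jar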

Regards
