Hi Spark-users,
 I want to submit as many Spark applications as the cluster's resources permit. I am using cluster deploy mode on a YARN cluster. YARN can queue and launch these applications without problems; the problem lies with spark-submit itself. Each spark-submit invocation starts a JVM on the machine I submit from, and when many of these launcher JVMs are running at once, they can fail due to insufficient memory on that machine.

Any suggestions on how to solve this problem? Thank you!
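For context, the submission pattern is roughly the sketch below, written against the SparkLauncher API purely for illustration (in reality I invoke plain spark-submit from a script; the jar path, main class, and application names are placeholders). The point is that each submission spawns its own spark-submit JVM on the local machine, even though the drivers themselves run on YARN in cluster mode:

import org.apache.spark.launcher.SparkLauncher

object SubmitMany {
  def main(args: Array[String]): Unit = {
    val apps = Seq("job-1", "job-2", "job-3")    // placeholder application names
    val launcherProcesses = apps.map { name =>
      new SparkLauncher()
        .setAppName(name)
        .setAppResource("/path/to/app.jar")      // placeholder jar
        .setMainClass("com.example.Main")        // placeholder main class
        .setMaster("yarn")
        .setDeployMode("cluster")
        .launch()                                // starts a local spark-submit JVM per application
    }
    // Every element of launcherProcesses is a separate JVM running on this
    // machine; with enough of them running concurrently, the machine runs
    // out of memory.
    launcherProcesses.foreach(_.waitFor())
  }
}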
