Hi All,

I am running Spark on YARN with multiple Spark jobs on the cluster.
Sometimes some jobs do not get enough resources even though there are
plenty of free resources available on the cluster, and even when I use the settings below:

--num-workers 75 \
--worker-cores 16
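
For reference, the full submit command looks roughly like the sketch below. I have written it with the newer flag names (--num-executors / --executor-cores, which I believe replaced --num-workers / --worker-cores); the memory, queue, class, and jar are placeholders rather than my real values:

# memory, queue, class, and jar below are placeholders
spark-submit \
  --master yarn-cluster \
  --num-executors 75 \
  --executor-cores 16 \
  --executor-memory 16g \
  --queue default \
  --class com.example.MyJob \
  my-job.jar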

Jobs stick with whatever resources they were given when they started.

Do we need to look at any other configs? Can someone give me pointers on this
issue?
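
One thing I came across but have not tried yet is dynamic allocation. Here is a minimal sketch of the spark-defaults.conf settings I was considering, assuming Spark 1.2+ with the YARN external shuffle service (the min/max numbers are just guesses):

# spark-defaults.conf (untested, values are guesses)
spark.dynamicAllocation.enabled        true
spark.dynamicAllocation.minExecutors   10
spark.dynamicAllocation.maxExecutors   75
spark.shuffle.service.enabled          true

My understanding is that this also requires the spark_shuffle auxiliary service to be configured on the YARN NodeManagers, but I have not set that up yet, so please correct me if this is the wrong direction.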

Thanks





