I am running a Spark application on a YARN-managed cluster.

When I specify --executor-cores > 4, the application fails to start.
I am starting the app with:

    spark-submit --class classname --num-executors 10 --executor-cores 5 --master masteradd jarname

It fails with:

    Exception in thread "main" org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
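
The client-side exception above is generic, so I assume the real failure reason is in the application master logs. This is how I would pull them (the application id below is a placeholder, not the real one):

    # Fetch aggregated container logs for the failed application
    # (<application_id> is a placeholder; substitute the real id
    # printed by spark-submit or shown in the ResourceManager UI)
    yarn logs -applicationId <application_id>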

When I give --executor-cores as 4, it works fine.

My cluster has 10 nodes.
Why am I not able to specify more than 4 cores per executor? Is there a maximum limit on the YARN side or the Spark side that I can override to run more concurrent tasks per executor?
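
My current guess, which I have not verified, is YARN's per-container vcore cap, yarn.scheduler.maximum-allocation-vcores (I believe it defaults to 4, which would match the behaviour above). A sketch of how I would check it, assuming HADOOP_CONF_DIR points at the cluster configuration:

    # Show the configured per-container vcore cap (if the property is
    # absent from yarn-site.xml, YARN falls back to its default of 4)
    grep -A2 'yarn.scheduler.maximum-allocation-vcores' "$HADOOP_CONF_DIR/yarn-site.xml"

If that is the limit, would raising the value in yarn-site.xml on the ResourceManager (and restarting it) allow containers with more than 4 vcores?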
