Hi All,

I am a newbie to Spark and have a quick question.
I am running Spark 2.3.2 on YARN with Hadoop 2.8.5 on EMR 5.19.0.

Since the EMR version is 5.19, dynamic allocation is enabled by default. I
haven't set the min and max executors, but I saw that by default it starts
with max(spark.dynamicAllocation.initialExecutors,
spark.dynamicAllocation.minExecutors, spark.executor.instances). Since I
haven't configured anything, it starts with spark.executor.instances.
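
For reference, here is a minimal sketch of that selection as I understand
it from the Spark docs (the SparkConf values below are hypothetical, just
to show which of the three properties wins):

    import org.apache.spark.SparkConf

    // Sketch of how the starting executor count seems to be chosen when
    // dynamic allocation is on: the max of three properties. The values
    // set here are made up for illustration.
    val conf = new SparkConf()
      .set("spark.dynamicAllocation.minExecutors", "2")
      .set("spark.executor.instances", "6")

    val startingExecutors = Seq(
      conf.getInt("spark.dynamicAllocation.initialExecutors", 0),
      conf.getInt("spark.dynamicAllocation.minExecutors", 0),
      conf.getInt("spark.executor.instances", 0)
    ).max  // 6 here, coming from spark.executor.instances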
My question is: how does dynamic allocation decide the number of
executors and cores if I haven't specified anything explicitly? (I have
seen it vary across runs.)
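
For contrast, pinning everything explicitly would look something like the
sketch below (the property names are standard Spark settings; the app name
and values are hypothetical). Without settings like these, the counts are
left to dynamic allocation and the cluster's defaults:

    import org.apache.spark.sql.SparkSession

    // Hypothetical example of fixing resources explicitly so the executor
    // and core counts do not vary between runs; values are made up.
    val spark = SparkSession.builder()
      .appName("pinned-resources-sketch")  // hypothetical app name
      .config("spark.dynamicAllocation.enabled", "false")
      .config("spark.executor.instances", "10")
      .config("spark.executor.cores", "4")
      .getOrCreate()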

(Note: I am using a spot instance fleet here.)

Thanks,
Pooja
