Hi,

I'm wondering why YARN computes the initial number of executors on its
own (in YarnSparkHadoopUtil.getInitialTargetExecutorNumber [1]) when
Core's Utils.getDynamicAllocationInitialExecutors [2] could do the job.

I'd like to send a PR to remove the duplication, as the logic is tricky
enough to keep correct in one place given all the Spark properties
involved and their relationships:

* spark.dynamicAllocation.minExecutors
* spark.dynamicAllocation.initialExecutors
* spark.executor.instances
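
For illustration, here is a minimal sketch of how I understand the
resolution of these three properties into an initial executor count.
The function name and signature are my own; only the property names and
the max-of-the-three behaviour come from the code linked below, so treat
this as an assumption, not the actual implementation:

```scala
// Hypothetical helper sketching the property resolution; the real logic
// lives in Utils.getDynamicAllocationInitialExecutors [2].
object InitialExecutorsSketch {
  def initialExecutors(
      dynamicAllocationEnabled: Boolean,
      minExecutors: Int,        // spark.dynamicAllocation.minExecutors
      initialExecutors: Int,    // spark.dynamicAllocation.initialExecutors
      executorInstances: Int    // spark.executor.instances
  ): Int =
    if (dynamicAllocationEnabled)
      // With dynamic allocation on, the largest of the three wins.
      Seq(minExecutors, initialExecutors, executorInstances).max
    else
      // Without dynamic allocation, only spark.executor.instances matters.
      executorInstances
}
```

If that sketch is roughly right, it also shows why keeping two copies of
this logic in sync (YARN and Core) is error-prone.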

WDYT?

[1] 
https://github.com/apache/spark/blob/master/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnSparkHadoopUtil.scala#L270

[2] 
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/Utils.scala#L2516

Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
