For 1), this is a recurring question on this mailing list, and the answer
is: no, Spark does not support coordination between multiple Spark
applications. Spark relies on an external resource manager, such as YARN
or Kubernetes, to allocate resources across multiple Spark applications. For
example
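(A minimal sketch of that division of labor, assuming a standalone or YARN
deployment; the application name and the resource values below are made-up
illustrations, not settings from this thread. Each application only declares
its own requests and caps, and the cluster manager decides how the cluster's
capacity is split between applications.)

    # Hypothetical per-application resource settings; Spark itself never
    # negotiates with other applications -- the cluster manager does.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("app-a")                       # made-up application name
        .config("spark.cores.max", "4")         # standalone: cap total cores for this app
        .config("spark.executor.memory", "2g")  # per-executor memory request
        .getOrCreate()
    )

    # A second application submitted the same way gets its own limits; how the
    # remaining capacity is shared is decided by the standalone master, YARN,
    # or Kubernetes, not by either Spark application.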
I have a few problems for which I would like to know whether there is no
solution (due to the current implementation) or whether there is a way that I
am simply not aware of.
1)
Currently, we can enable and configure dynamic resource allocation based on
the documentation below:
https://spark.apache.org/docs/late
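For concreteness, a minimal sketch of what that configuration typically looks
like (the application name is made up and the min/max executor values are
placeholder numbers; the config keys themselves are standard Spark
dynamic-allocation settings):

    # Enable dynamic allocation when building the session; the same keys can
    # also go in spark-defaults.conf or be passed via --conf to spark-submit.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("dynamic-allocation-demo")                    # made-up name
        .config("spark.dynamicAllocation.enabled", "true")
        .config("spark.dynamicAllocation.minExecutors", "1")   # placeholder bounds
        .config("spark.dynamicAllocation.maxExecutors", "10")
        # Executors can only be released safely if shuffle data survives them:
        # either run the external shuffle service or enable shuffle tracking.
        .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
        .getOrCreate()
    )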