Of course, we can make the Apache Spark distribution bigger and bigger, but I'm a little neutral about Volcano.
In any case, I'd like to point out that the root cause of the difference is the schedulers' designs rather than Apache Spark itself. For example, Apache YuniKorn doesn't force us to add a new dependency.
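(To illustrate: per the Spark docs, routing a job to YuniKorn takes only spark-submit conf options, with no extra module compiled into Spark, roughly:

    --conf spark.kubernetes.scheduler.name=yunikorn
    --conf spark.kubernetes.driver.label.queue=root.default
    --conf spark.kubernetes.executor.label.queue=root.default
    --conf spark.kubernetes.driver.annotation.yunikorn.apache.org/app-id={{APP_ID}}
    --conf spark.kubernetes.executor.annotation.yunikorn.apache.org/app-id={{APP_ID}}

Here {{APP_ID}} is the built-in variable Spark substitutes with the job ID, and root.default is just an example queue name. Volcano, by contrast, pulls a Volcano client dependency and the VolcanoFeatureStep into the build.)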
I found some of my notes on Volcano and my tests from back in Feb 2022. I ran those Volcano tests on Spark 3.1.1, and the results were not great at the time. Hence I asked in the thread from @santosh whether any updated comparisons are available. I will try the test with Spark 3.4.1 at some point. Maybe some users have more recent results to share.
@Santosh
We tried to add this in v3.3.0. [1] The main reasons for not adding it at that time were:
1. Volcano did not support multi-arch before v1.7.0. (We have been on 1.7.0 since Spark 3.4.0.)
2. Spark on K8s + Volcano was experimental. (We have since removed the experimental label. [2])
Consider spark volcan
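(For anyone who wants this today: since the -Pvolcano profile exists, a Volcano-enabled distribution can be built from source; a sketch based on the "Building Spark" docs:

    ./dev/make-distribution.sh --name custom-spark --tgz -Pkubernetes -Pvolcano

The --name value is just an example label for the resulting tarball.)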
Hi Santosh,
We had a Google team discussion about k8s back in February, and Volcano was mentioned then.
My personal experience with Volcano was not that impressive. Do you have any stats showing that it is worth adding?
Anyone else is welcome to comment.
HTH
Mich Talebzadeh
Hey all,
It would be useful to support Volcano in the Spark distro itself, just like YuniKorn. So I am wondering what the reasoning was behind the decision not to package it already.
https://spark.apache.org/docs/latest/running-on-kubernetes.html#using-volcano-as-customized-scheduler-for-spark-on-kubernetes
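(That page boils down to a few extra spark-submit conf options; copied roughly from the docs:

    --conf spark.kubernetes.scheduler.name=volcano
    --conf spark.kubernetes.scheduler.volcano.podGroupTemplateFile=/path/to/podgroup-template.yaml
    --conf spark.kubernetes.driver.pod.featureSteps=org.apache.spark.deploy.k8s.features.VolcanoFeatureStep
    --conf spark.kubernetes.executor.pod.featureSteps=org.apache.spark.deploy.k8s.features.VolcanoFeatureStep

The podGroupTemplateFile path is a placeholder; it points at a user-supplied PodGroup template carrying queue and min-resource settings.)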
We are happy to announce the availability of Apache Spark 3.3.3!
Spark 3.3.3 is a maintenance release containing stability fixes. This
release is based on the branch-3.3 maintenance branch of Spark. We strongly
recommend that all 3.3 users upgrade to this stable release.
To download Spark 3.3.3, head over to the download page:
https://spark.apache.org/downloads.html