Thanks for the input, Sean.

> Spark depends on a number of Scala libraries, so needs them all to support
> version X before Spark can. This only happened for 2.13 about 4-5 months
> ago. I wonder if even a fraction of the necessary libraries have 3.0
> support yet?
As far as I understand, this is where one of the differences shows up: a Scala 3 project can use Scala 2.13 binary dependencies. It was expected that third-party libraries would lag behind the compiler and the standard library for a significant amount of time, so Scala 3 is "retro-compatible" with Scala 2.13 artifacts, in their terms [1][2] (see the build sketch at the end of this mail). That is why I was able to see actual compilation errors while using the full dependency set from Spark with Scala 2.13.

[1] https://contributors.scala-lang.org/t/scala-2-to-3-transition-some-updates-from-the-scala-center/4013
[2] https://scalacenter.github.io/scala-3-migration-guide/docs/compatibility.html#a-scala-3-module-depending-on-a-scala-2-artifact

I believe this changes the perspective a bit: only source code changes are left to consider, i.e. how much to diverge from the current state of the sources and whether it is possible to maintain everything in the same source tree.
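To make that concrete, here is a minimal build.sbt sketch (not taken from the Spark build; the library and version numbers are only placeholders) of how an sbt 1.5+ project compiled with Scala 3 can consume a Scala 2.13 artifact via CrossVersion.for3Use2_13:

  // build.sbt -- minimal sketch, assumes sbt 1.5+; the dependency below is just an example
  scalaVersion := "3.0.0"

  // Resolve the _2.13 artifact of this dependency from a Scala 3 build
  libraryDependencies += ("org.scala-lang.modules" %% "scala-parallel-collections" % "1.0.0")
    .cross(CrossVersion.for3Use2_13)

With that setting sbt picks up the existing *_2.13 jar instead of looking for a *_3 one, so the dependencies resolve and the remaining errors come from compiling the sources themselves.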