Let me add my answers to a few Scala questions, Jungtaek.

> Are we concerned that a library does not release a new version
> which bumps the Scala version, which the Scala version is
> announced in less than a week?

No. Our concern is the newly introduced inability
in the Apache Spark Scala environment.



> Shall we respect the efforts of all maintainers of open source projects
> we use as dependencies, regardless whether they are ASF projects or
> individuals?

We not only respect all those efforts; Yang Jie and I have also been
participating in those individual projects to help both them and us.
I believe we've done our best to collaborate there.


> Bumping a bugfix version is not always safe,
> especially for Scala where they use semver as one level down
> their minor version is almost another's major version
> (similar amount of pain on upgrading).

I agree with you in two ways.

1. Before adding the Ammonite dependency, the Apache Spark community itself
was one of the major Scala users participating in new-version testing, and
we gave active feedback to the Scala community. In addition, we decided for
ourselves whether or not to consume a new version. Now the Apache Spark
community has lost that ability because the build fails at the dependency
download step. We are waiting only because we have no alternative. That's
the big difference: being able to choose, or not.

2. Again, I must reiterate that this is one of the reasons why I reported an
issue: "There is a company claiming something non-Apache, like 'Apache Spark
3.4.0 minus SPARK-40436', with the name 'Apache Spark 3.4.0'."


Dongjoon.
