If one argument here is the ongoing build/maintenance burden, I think we
should seriously consider dropping Scala 2.10 in Spark 2.0. Supporting
Scala 2.10 is a bigger build/infrastructure burden than supporting JDK 7,
since you actually have to build different artifacts and test each of
them, whereas you can compile Spark targeting Java 1.7 and just test the
same artifact on JDK 8.
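
To make that asymmetry concrete, here is a minimal sketch of what the two
cases look like in an sbt build (version numbers and settings here are
illustrative, not Spark's actual build definition):

    // Scala 2.10 vs 2.11: two binary-incompatible sets of artifacts,
    // each of which has to be built, published, and tested separately.
    crossScalaVersions := Seq("2.10.6", "2.11.8")

    // JDK 7 vs 8: one artifact. Compile to the 1.7 language/bytecode
    // level and the same jar runs on either JDK; JDK 8 only needs an
    // extra test pass, not a separate build.
    javacOptions ++= Seq("-source", "1.7", "-target", "1.7")
    scalacOptions += "-target:jvm-1.7"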

In addition, as others have pointed out, dropping support for a JDK is a
bigger pain than dropping a Scala version. So if we are considering
dropping Java 7, which is a breaking change on the infra side, now is
also a good time to drop Scala 2.10 support.

Kostas

P.S. I haven't heard anyone on this thread fight for Scala 2.10 support.

On Thu, Mar 24, 2016 at 2:46 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> On Thu, Mar 24, 2016 at 2:41 PM, Jakob Odersky <ja...@odersky.com> wrote:
> > You can, but since it's going to be a maintainability issue I would
> > argue it is in fact a problem.
>
> Everything you choose to support generates a maintenance burden.
> Supporting 3 versions of Scala would be a huge maintenance burden, for
> example, as is supporting 2 versions of the JDK. Just note that,
> technically, we do support 2 versions of the JDK today; we just don't
> do a lot of automated testing on JDK 8 (PRs are all built with JDK 7
> AFAIK).
>
> So in the end it's a compromise. How many users will be affected by
> your choices? That, I think, is the most important question. If
> switching to Java 8-only means a bunch of users won't be able to
> upgrade, then Spark 2.0 will get less use than 1.x and will take
> longer to gain traction. That has other ramifications: less use means
> fewer issues get found, and overall quality may suffer at the
> beginning of this transition.
>
> --
> Marcelo
>
