In addition, with Spark 2.0, we are breaking binary compatibility
anyway, so user applications will have to be recompiled.

The only argument I can see is for libraries that were built against
Scala 2.10 and are no longer maintained. How big an issue do
we think that is?

Kostas

On Thu, Mar 24, 2016 at 4:48 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> On Thu, Mar 24, 2016 at 4:46 PM, Reynold Xin <r...@databricks.com> wrote:
> > Actually it's *way* harder to upgrade Scala from 2.10 to 2.11, than
> > upgrading the JVM runtime from 7 to 8, because Scala 2.10 and 2.11 are
> not
> binary compatible, whereas JVM 7 and 8 are binary compatible except in
> certain
> > esoteric cases.
>
> True, but ask anyone who manages a large cluster how long it would
> take them to upgrade the jdk across their cluster and validate all
> their applications and everything... binary compatibility is a tiny
> drop in that bucket.
>
> --
> Marcelo
>
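For context on why the Scala side of the upgrade bites library users: the
binary incompatibility between 2.10 and 2.11 shows up directly in artifact
names, since Scala libraries are cross-published per binary version. A
minimal sbt sketch (version numbers are illustrative):

```scala
// Hypothetical sbt build fragment illustrating Scala cross-versioning.
// The %% operator appends the Scala binary version suffix to the
// artifact name, so the same dependency line resolves to a different
// artifact depending on scalaVersion.
scalaVersion := "2.11.8"

// Resolves to the artifact "spark-core_2.11". Under scalaVersion 2.10.x
// it would resolve to "spark-core_2.10" instead. A library published
// only for _2.10 cannot satisfy this dependency without being
// recompiled and republished for 2.11.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"
```

This is why unmaintained 2.10-only libraries are the sticking point: there
is no `_2.11` artifact to resolve, and no one around to publish one.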
