I'm in favor of dropping 2.11 support. My general reasoning is:

- 2.11 is EOL, and will be even more so by the middle of next year,
when Spark 3 arrives
- I haven't heard of a critical dependency that has no 2.12 counterpart
- 2.11 users can stay on 2.4.x, which will be notionally supported
through, say, the end of 2019
- Maintaining 2.11 and 2.12 support simultaneously is modestly
difficult. In my experience, resolving the differences across the two
versions is a hassle: you effectively need two git clones, each
configured for a different Scala version (a sketch of the cross-build
overhead follows this list)
- The project is already short on resources to support things as it is
- Dropping things is generally necessary to add new things, to keep
complexity reasonable -- like Scala 2.13 support
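To make that cross-build overhead concrete, here is a minimal,
hypothetical sbt sketch (Spark's own build is Maven-based, so the
mechanics differ, but the duplicated work is the same):

    // build.sbt (hypothetical sketch of a 2.11/2.12 cross-build)
    scalaVersion := "2.12.7"
    crossScalaVersions := Seq("2.11.12", "2.12.7")
    // sbt also compiles version-specific sources from
    // src/main/scala-2.11 and src/main/scala-2.12, which is where any
    // code that differs between the two versions has to live.

Running "sbt +compile" then builds against every listed version, and
every version-specific source file has to be kept working in both
places.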

Maintaining a separate PR builder for 2.11 isn't so bad.

On Fri, Nov 16, 2018 at 4:09 PM Marcelo Vanzin
<van...@cloudera.com.invalid> wrote:
>
> Now that the switch to 2.12 by default has been made, it might be good
> to have a serious discussion about dropping 2.11 altogether. Many of
> the main arguments have already been talked about. But I don't
> remember anyone mentioning how easy it would be to break the 2.11
> build now.
>
> For example, the following works fine in 2.12 but breaks in 2.11:
>
> java.util.Arrays.asList("hi").stream().forEach(println)
>
> We had a similar issue when we supported Java 1.6 but the builds were
> all on 1.7 by default. Every once in a while something would silently
> break, because PR builds only check the default. And the Jenkins
> builds, which are less monitored, would stay broken for a while.
>
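
(For context on why that one-liner breaks: Scala 2.11 has no automatic
SAM conversion for Java functional interfaces by default (it was only
available behind -Xexperimental), so on 2.11 the Consumer has to be
written out explicitly. A minimal sketch of the 2.11-compatible
spelling:)

    import java.util.function.Consumer

    // In 2.12 the function literal is SAM-converted to a Consumer
    // automatically; in 2.11 an explicit anonymous class is required:
    java.util.Arrays.asList("hi").stream().forEach(new Consumer[String] {
      override def accept(s: String): Unit = println(s)
    })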
