We made Scala 2.11 the default in Spark 2.0, then dropped Scala 2.10 in
Spark 2.3. Shall we follow the same pattern and drop Scala 2.11 at some
point in Spark 3.x?

On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <r...@databricks.com> wrote:

> Have we deprecated Scala 2.11 already in an existing release?
>
> On Tue, Nov 6, 2018 at 4:43 PM DB Tsai <d_t...@apple.com> wrote:
>
>> Supporting only Scala 2.12 in Spark 3 would be ideal.
>>
>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
>> Apple, Inc
>>
>> > On Nov 6, 2018, at 2:55 PM, Felix Cheung <felixcheun...@hotmail.com>
>> wrote:
>> >
>> > So to clarify, only Scala 2.12 is supported in Spark 3?
>> >
>> >
>> > From: Ryan Blue <rb...@netflix.com.invalid>
>> > Sent: Tuesday, November 6, 2018 1:24 PM
>> > To: d_t...@apple.com
>> > Cc: Sean Owen; Spark Dev List; cdelg...@apple.com
>> > Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0
>> >
>> > +1 to Scala 2.12 as the default in Spark 3.0.
>> >
>> > On Tue, Nov 6, 2018 at 11:50 AM DB Tsai <d_t...@apple.com> wrote:
>> > +1 on dropping Scala 2.11 in Spark 3.0 to simplify the build.
>> >
>> > As Scala 2.11 will not support Java 11 unless we make a significant
>> > investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we
>> > can do is have only the Scala 2.12 build support Java 11 while the
>> > Scala 2.11 build supports Java 8. But I agree with Sean that this can
>> > make the dependencies really complicated; hence I support dropping
>> > Scala 2.11 in Spark 3.0 directly.
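>> >
>> > A minimal sbt-style sketch of that split-build idea (illustrative only;
>> > Spark's actual build is Maven-based, and the version numbers here are
>> > just placeholders):
>> >
>> >     // build.sbt: cross-build for Scala 2.11 and 2.12, with a different
>> >     // JDK bytecode target per Scala version.
>> >     crossScalaVersions := Seq("2.11.12", "2.12.7")
>> >
>> >     // Emit Java 11 bytecode only for the Scala 2.12 build; keep the
>> >     // Scala 2.11 build on Java 8, which it can actually support.
>> >     javacOptions ++= (CrossVersion.partialVersion(scalaVersion.value) match {
>> >       case Some((2, 12)) => Seq("-source", "11", "-target", "11")
>> >       case _             => Seq("-source", "1.8", "-target", "1.8")
>> >     })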
>> >
>> > DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
>> Apple, Inc
>> >
>> >> On Nov 6, 2018, at 11:38 AM, Sean Owen <sro...@gmail.com> wrote:
>> >>
>> >> I think we should make Scala 2.12 the default in Spark 3.0. I would
>> >> also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping
>> >> 2.11 support means we'd support Scala 2.11 for years, the lifetime
>> >> of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or
>> >> 3.2.0 release, kind of like what happened with 2.10 in 2.x.
>> >>
>> >> Java (9-)11 support also complicates this. I think getting it to work
>> >> will need some significant dependency updates, and I worry that not all
>> >> of them will be available for 2.11, or that they will present some
>> >> knotty problems. We'll find out soon if that forces the issue.
>> >>
>> >> Also note that Scala 2.13 is pretty close to release, and we'll want
>> >> to support it soon after release, perhaps sooner than the long delay
>> >> before 2.12 was supported (because it was hard!). It will probably be
>> >> out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds
>> >> like too much. 3.0 could support 2.11 and 2.12, and 3.1 support 2.12
>> >> and 2.13, or something. But if 2.13 support is otherwise attainable at
>> >> the release of Spark 3.0, I wonder if that too argues for dropping
>> >> 2.11 support.
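>> >>
>> >> To make the cross-compiling cost concrete, a small sbt-style sketch:
>> >> each entry in crossScalaVersions means a full extra set of
>> >> _2.x-suffixed artifacts to compile, test, and publish ("2.13.0" stands
>> >> in for the not-yet-released version, and "spark-xyz" is a made-up
>> >> module name):
>> >>
>> >>     crossScalaVersions := Seq("2.11.12", "2.12.7", "2.13.0")
>> >>     // Running "+publishLocal" in sbt then builds and publishes
>> >>     // spark-xyz_2.11, spark-xyz_2.12, and spark-xyz_2.13 separately.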
>> >>
>> >> Finally, I'll say that Spark itself isn't dropping 2.11 support for a
>> >> while, no matter what; it still exists in the 2.4.x branch, of course.
>> >> Note that people who can't move off Scala 2.11 can stay on Spark 2.x.
>> >>
>> >> Sean
>> >>
>> >>
>> >> On Tue, Nov 6, 2018 at 1:13 PM DB Tsai <d_t...@apple.com> wrote:
>> >>>
>> >>> We made Scala 2.11 the default Scala version in Spark 2.0. Now that
>> >>> the next Spark version will be 3.0, it's a great time to discuss
>> >>> whether we should make Scala 2.12 the default Scala version in Spark 3.0.
>> >>>
>> >>> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11
>> >>> is unlikely to be supported on Scala 2.11 unless we're willing to
>> >>> sponsor the needed work, per the discussion in the Scala community:
>> >>> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>> >>>
>> >>> We have initial support for Scala 2.12 in Spark 2.4. If we decide now
>> >>> to make Scala 2.12 the default for Spark 3.0, we will have ample time
>> >>> to work through the bugs and issues we may run into.
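>> >>>
>> >>> For what it's worth, downstream sbt users can pick up the 2.12
>> >>> artifacts just by changing scalaVersion, since %% appends the Scala
>> >>> binary version to the artifact name (a sketch; version numbers are
>> >>> illustrative):
>> >>>
>> >>>     scalaVersion := "2.12.7"
>> >>>     // With %%, this resolves to spark-core_2.12; under scalaVersion
>> >>>     // 2.11.12 it would resolve to spark-core_2.11 instead.
>> >>>     libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"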
>> >>>
>> >>> What do you think?
>> >>>
>> >>> Thanks,
>> >>>
>> >>> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
>> Apple, Inc
>> >>>
>> >>>
>> >
>> >
>> >
>> > --
>> > Ryan Blue
>> > Software Engineer
>> > Netflix
>>
