That's possible here, sure. The question is: would you exclude Scala 2.13 support in 3.0 over this, if it were otherwise ready to go? I don't think it's a hard rule that something must be deprecated in a prior release before it can be removed in a major release. The notice is helpful, sure, but there are lots of ways to provide that notice to end users (see the sketch below for one), and lots of things are breaking changes in a major release anyway. Or: deprecate in Spark 2.4.1, if desired?
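For instance, here's a minimal sketch of one way to surface such a notice at runtime rather than through release notes: log a warning when the application is running on a Scala version slated for removal. This is purely illustrative, not Spark's actual code; the `ScalaVersionCheck` object and `warnIfDeprecated` method are hypothetical names, and it assumes slf4j is on the classpath (as it is for Spark):

    // Hypothetical sketch; not actual Spark code.
    import org.slf4j.LoggerFactory

    object ScalaVersionCheck {
      private val log = LoggerFactory.getLogger(getClass)

      def warnIfDeprecated(): Unit = {
        // scala.util.Properties.versionNumberString returns e.g. "2.11.12"
        if (scala.util.Properties.versionNumberString.startsWith("2.11")) {
          log.warn("Support for Scala 2.11 is deprecated and will be removed " +
            "in a future major release. Please migrate to Scala 2.12.")
        }
      }
    }

A check like that could run once at SparkContext startup, so users see the notice even if they never read the docs.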
On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan <cloud0...@gmail.com> wrote:
>
> We made Scala 2.11 the default in Spark 2.0, then dropped Scala 2.10 in
> Spark 2.3. Shall we follow that pattern and drop Scala 2.11 at some point
> in Spark 3.x?
>
> On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <r...@databricks.com> wrote:
>>
>> Have we deprecated Scala 2.11 already in an existing release?