Here is a concrete proposal:

1. Plan to remove support for Java 7 / Scala 2.10 in Spark 2.2.0 (Mar/Apr
2017).

2. In the Spark 2.1.0 release, aggressively and explicitly announce the
deprecation of Java 7 / Scala 2.10 support.

(a) It should appear in the release notes and in any documentation that
mentions how to build Spark,

(b) and a warning should be shown every time SparkContext is started using
Scala 2.10 or Java 7 (see the sketch below).
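
As a rough illustration of 2(b), something like the following could run once
during SparkContext initialization. This is only a sketch for discussion - the
object and method names are made up, and this is not actual Spark code:

// Hypothetical helper: warn at startup when running on a deprecated
// platform (Java 7 or Scala 2.10).
object PlatformDeprecationCheck {
  def warnIfDeprecated(): Unit = {
    // "java.specification.version" is "1.7" on Java 7 and "1.8" on Java 8.
    val javaVersion = System.getProperty("java.specification.version")
    if (javaVersion == "1.7") {
      System.err.println(
        "WARNING: Support for Java 7 is deprecated as of Spark 2.1.0 and " +
          "will be removed in Spark 2.2.0; please move to Java 8 or newer.")
    }
    // versionNumberString is the running Scala version, e.g. "2.10.6".
    val scalaVersion = scala.util.Properties.versionNumberString
    if (scalaVersion.startsWith("2.10")) {
      System.err.println(
        "WARNING: Support for Scala 2.10 is deprecated as of Spark 2.1.0 " +
          "and will be removed in Spark 2.2.0; please build with Scala 2.11.")
    }
  }
}

Calling a check like this from SparkContext's constructor would make the
warning show up in the driver logs regardless of how the application is
launched.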



On Wed, Oct 26, 2016 at 7:50 PM, Dongjoon Hyun <dongj...@apache.org> wrote:

> Hi, Daniel.
>
> I guess that kind of work will start in earnest in 2.1.0, after the PMC's
> announcement/reminder on the mailing list.
>
> Bests,
> Dongjoon.
>
>
> On Wednesday, October 26, 2016, Daniel Siegmann
> <dsiegm...@securityscorecard.io> wrote:
>
>> Is the deprecation of JDK 7 and Scala 2.10 documented anywhere outside
>> the release notes for Spark 2.0.0? I do not consider release notes to be
>> sufficient public notice for the deprecation of supported platforms - this
>> should be noted in the documentation somewhere. Here are the only
>> mentions I could find:
>>
>> At http://spark.apache.org/downloads.html it says:
>>
>> "*Note: Starting version 2.0, Spark is built with Scala 2.11 by default.
>> Scala 2.10 users should download the Spark source package and build with
>> Scala 2.10 support
>> <http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-210>."*
>>
>> At http://spark.apache.org/docs/latest/#downloading it says:
>>
>> "Spark runs on Java 7+, Python 2.6+/3.4+ and R 3.1+. For the Scala API,
>> Spark 2.0.1 uses Scala 2.11. You will need to use a compatible Scala
>> version (2.11.x)."
>>
>> At http://spark.apache.org/docs/latest/programming-guide.html#linking-with-spark
>> it says:
>>
>>    - "Spark 2.0.1 is built and distributed to work with Scala 2.11 by
>>    default. (Spark can be built to work with other versions of Scala, too.) 
>> To
>>    write applications in Scala, you will need to use a compatible Scala
>>    version (e.g. 2.11.X)."
>>    - "Spark 2.0.1 works with Java 7 and higher. If you are using Java 8,
>>    Spark supports lambda expressions
>>    
>> <http://docs.oracle.com/javase/tutorial/java/javaOO/lambdaexpressions.html>
>>    for concisely writing functions, otherwise you can use the classes in the
>>    org.apache.spark.api.java.function
>>    
>> <http://spark.apache.org/docs/latest/api/java/index.html?org/apache/spark/api/java/function/package-summary.html>
>>    package."
>>    - "Spark 2.0.1 works with Python 2.6+ or Python 3.4+. It can use the
>>    standard CPython interpreter, so C libraries like NumPy can be used. It
>>    also works with PyPy 2.3+."