If I recall correctly, this only affected PySpark. There were always 2.11 and 2.12 builds of 2.4.x, but the (single) PySpark distro unintentionally shipped with 2.12, and that was reversed.
This comment is referring to the Scala API. In releases where both Scala 2.11 and 2.12 were supported, it looks like the docs generation process used 2.12 and auto-generated this line. It's "true," but there was also a 2.11 build. And it doesn't tell you what PySpark has inside, which might matter a little more, although presumably PySpark users mostly don't care what's going on in the JVM. (A quick way to check from PySpark itself is sketched after the quoted message below.)

It's safe to assume the PySpark distro will stick with the older of the two Scala versions when two are available, as is about to be the case again for Spark 3.2.0, which adds 2.13 support; its PySpark distro is still on 2.12.

On Wed, Sep 29, 2021 at 6:58 PM Brandon Chinn <[email protected]> wrote:

> Hello,
>
> I'm looking at this SO post: https://stackoverflow.com/a/56197399, which
> says that 2.4.1 changed to Scala 2.12, then 2.4.3 changed back to Scala
> 2.11, but the docs still say Scala 2.12, e.g.
> https://spark.apache.org/docs/2.4.5/#downloading:
>
>> For the Scala API, Spark 2.4.5 uses Scala 2.12
>
> This also doesn't match behavior, as I indeed see
>
>     Welcome to Spark version 2.4.5
>     Using Scala version 2.11.12
>
> in the Spark output. Are the docs indeed incorrect? Can they be updated?
>
> --
> Brandon Chinn
> LeapYear Technologies (http://leapyear.io)
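For anyone who wants to verify this directly, here is a rough sketch of how to inspect what a given PySpark distro has inside. It reaches the JVM through py4j's _jvm handle, which is an internal attribute rather than public API, so treat it as a diagnostic trick, not something to depend on in application code:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Spark version as reported on the Python side
    print(spark.version)  # e.g. 2.4.5

    # Scala version of the jars bundled in the PySpark distro,
    # reached via the py4j gateway (_jvm is internal/undocumented)
    print(spark.sparkContext._jvm.scala.util.Properties.versionString())
    # e.g. "version 2.11.12"

The second line should match the "Using Scala version ..." banner Brandon quoted, since both report the Scala version compiled into the jars that actually ship inside the PySpark distribution.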
