Hi, Koert.
We know about it, welcome it, and believe in it. However, so far it is only
the Scala community's roadmap; it doesn't mean Apache Spark supports Scala 3
officially. For example, Apache Spark 3.0.1 supports Scala 2.12.10 but not
2.12.12, due to a Scala issue.
In the Apache Spark community, we had better focus on
I have gotten used to Spark always returning a WrappedArray for Seq. At some
point I think I even read this was guaranteed to be the case; not sure if it
still is...
In Spark 3.0.1 with Scala 2.12 I get a WrappedArray as expected:

scala> val x = Seq((1,2),(1,3)).toDF
x: org.apache.spark.sql.DataFrame = [_1: int, _2: int]
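One way to see the concrete runtime type is to collect an array column and
inspect its class (a minimal spark-shell sketch, not from the original mail;
collect_list and Row.getSeq are standard Spark APIs):

scala> import org.apache.spark.sql.functions.collect_list
scala> val grouped = x.groupBy("_1").agg(collect_list("_2").as("ys"))
scala> grouped.collect().head.getSeq[Int](1).getClass
// on Spark 3.0.1 / Scala 2.12 this reports
// class scala.collection.mutable.WrappedArray$ofRef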
I think Scala 3.0 will be able to use libraries built with Scala 2.13 (as
long as they don't use macros). See:
https://www.scala-lang.org/2019/12/18/road-to-scala-3.html
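For what it's worth, later sbt releases (1.5+) expose exactly this interop;
a build.sbt sketch, with cats-core used purely as an illustrative dependency:

scalaVersion := "3.0.0"
// depend on a Scala 2.13 artifact from a Scala 3 build; this works as long
// as the library does not rely on Scala 2 macros
libraryDependencies += ("org.typelevel" %% "cats-core" % "2.6.1")
  .cross(CrossVersion.for3Use2_13)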
On Sun, Oct 18, 2020 at 9:54 AM Sean Owen wrote:
> Spark depends on a number of Scala libraries, so needs them all to support
> version X before Spark can.
Hi, Denis.
We are currently moving toward Scala 3 together by focusing on completing
SPARK-25075 first as a stepping stone.
https://issues.apache.org/jira/browse/SPARK-25075
Build and test Spark against Scala 2.13
We haven't finished it yet. We need to have Jenkins jobs with Scala 2.13.
Also
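For anyone who wants to try that work locally, building Spark against 2.13
looks roughly like this (a sketch based on the script and Maven profile from
the SPARK-25075 work; check the repo docs for the exact invocation):

# switch the POMs to Scala 2.13, then build with the 2.13 profile
./dev/change-scala-version.sh 2.13
./build/mvn -Pscala-2.13 -DskipTests clean package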
Thanks for the input, Sean.
> Spark depends on a number of Scala libraries, so needs them all to support
> version X before Spark can. This only happened for 2.13 about 4-5 months
> ago. I wonder if even a fraction of the necessary libraries have 3.0
> support yet?
As far as I understand, here sh
Spark depends on a number of Scala libraries, so needs them all to support
version X before Spark can. This only happened for 2.13 about 4-5 months
ago. I wonder if even a fraction of the necessary libraries have 3.0
support yet?
It can be difficult to test and support multiple Scala versions
simultaneously.
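As a concrete illustration of that burden, cross-building in sbt looks like
the sketch below (Spark itself uses Maven profiles rather than sbt
cross-building; the versions here are just examples):

// build.sbt: cross-compile one codebase against several Scala versions
crossScalaVersions := Seq("2.12.12", "2.13.3")
// version-specific sources go in src/main/scala-2.12 and src/main/scala-2.13,
// each of which has to be maintained and tested separately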
Sorry for the noise.
Please reply to this one.