Hello,
by default, some packages (those treated as internal) are excluded from the
documentation generation task. To generate Javadoc/Scaladoc for classes in
them, you would need to comment out the relevant line in the build definition
file. For example, the package `org/apache/spark/sql/execution` is mentioned
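(For illustration, here is a minimal sketch of the kind of source filter this
refers to, modeled on the ignoreUndocumentedPackages helper in
project/SparkBuild.scala; the method name and exact paths are assumptions and
may differ across Spark versions:)

    // Assumed shape of the unidoc source filter in project/SparkBuild.scala;
    // check your Spark version's build for the exact code.
    private def ignoreUndocumentedPackages(packages: Seq[Seq[java.io.File]]): Seq[Seq[java.io.File]] = {
      packages
        .map(_.filterNot(_.getName.contains("$")))
        // Commenting out a line like the one below makes unidoc include
        // the corresponding internal package in the generated docs:
        .map(_.filterNot(_.getCanonicalPath.contains("org/apache/spark/sql/execution")))
    }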
Hi all!
I'd like to ask for an opinion and discuss the following:
at the moment Spark can, for the most part, be built with Scala 2.11 and
2.12, and is close to the point of supporting Scala 2.13. On the other
hand, Scala 3 is entering its pre-release phase (with 3.0.0-M1
released at the
Sorry for the noise.
Please reply to this one.
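(For context, a minimal sketch of what declaring multiple Scala versions
looks like in an sbt build; the version numbers here are illustrative and
Spark's actual build in project/SparkBuild.scala is considerably more
involved:)

    // build.sbt (illustrative only, not Spark's actual settings)
    ThisBuild / scalaVersion       := "2.12.12"
    ThisBuild / crossScalaVersions := Seq("2.11.12", "2.12.12", "2.13.3")

    // Running `sbt +compile` then compiles once per listed Scala version.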
Thanks for the input, Sean.
> Spark depends on a number of Scala libraries, so needs them all to support
> version X before Spark can. This only happened for 2.13 about 4-5 months
> ago. I wonder if even a fraction of the necessary libraries have 3.0
> support yet?
As far as I understand, here sh