Maybe so; I think we have a ticket open to update to 2.10.6, which
may fix it.

It brings up a different point: supporting multiple Scala versions is
much more painful than supporting multiple Java versions, because
Scala versions are mutually incompatible. Right now I get the sense
there's an intent to keep supporting 2.10 and 2.11, and later 2.12,
in Spark 2. That seems like considerably more trouble. In the same
breath -- why not drop 2.10 support anyway? It's also EOL, 2.11
brought big improvements, etc.
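For context on why each extra Scala version carries a real cost: cross-building in sbt typically looks something like the sketch below (version numbers and the version-specific source directory are illustrative, not Spark's actual build):

```scala
// build.sbt -- minimal cross-build sketch; versions are illustrative
scalaVersion := "2.11.8"
crossScalaVersions := Seq("2.10.6", "2.11.8")

// A common pattern: keep version-specific sources in e.g.
// src/main/scala-2.10 and src/main/scala-2.11, since code that
// compiles on one Scala version may not compile on another.
unmanagedSourceDirectories in Compile +=
  (sourceDirectory in Compile).value / s"scala-${scalaBinaryVersion.value}"
```

Running `+compile` or `+publish` then builds against every listed version, and every dependency must itself publish artifacts for each Scala binary version -- which is why dropping a version meaningfully reduces the maintenance burden, in a way that dropping a Java version does not.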

On Thu, Mar 24, 2016 at 9:04 AM, Reynold Xin <r...@databricks.com> wrote:
> I actually talked quite a bit tonight with an engineer on the Scala compiler
> team, and the Scala 2.10 + Java 8 combo should be OK. The latest
> Scala 2.10 release should have all the important fixes that are needed for
> Java 8.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org