I'll point the Scala team to this issue, but it's unlikely to get fixed any
time soon.

dean


*Dean Wampler, Ph.D.*

*VP, Fast Data Engineering at Lightbend*
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do>, Fast Data Architectures
for Streaming Applications
<http://www.oreilly.com/data/free/fast-data-architectures-for-streaming-applications.csp>,
and other content from O'Reilly
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com
https://github.com/deanwampler

On Thu, Jun 7, 2018 at 4:27 PM, DB Tsai <d_t...@apple.com> wrote:

> Thanks Felix for bringing this up.
>
> Currently, with Scala 2.11.8, we initialize Spark by overriding loadFiles()
> so that our setup runs before the REPL sees any file, since Scala provides
> no good hook for loading our initialization code.
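>
> Roughly, that override looks like the following sketch. (initializeSpark()
> here is a stand-in for the code that creates the SparkSession and prints
> the Spark UI URL; this is illustrative, not the exact Spark source.)
>
>     import scala.tools.nsc.Settings
>     import scala.tools.nsc.interpreter.ILoop
>
>     class SparkILoop extends ILoop {
>       // Stand-in: runs the commands that create the SparkSession,
>       // bind spark/sc, and print the Spark UI URL.
>       def initializeSpark(): Unit = { /* ... */ }
>
>       // The 2.11.8 REPL calls loadFiles() before it processes any user
>       // input, so our initialization runs first.
>       override def loadFiles(settings: Settings): Unit = {
>         initializeSpark()
>         super.loadFiles(settings)
>       }
>     }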
>
> In Scala 2.11.12 and the newer versions of Scala 2.12.x, the loadFiles()
> method was removed.
>
> Alternatively, in the newer versions of Scala we can override
> initializeSynchronous(), as suggested by Som Snytt. I have a working PR
> with this approach, https://github.com/apache/spark/pull/21495 , and it
> should work for older versions of Scala too.
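>
> A minimal sketch of that idea (the exact wiring and visibility in the PR
> may differ; initializeSpark is again a stand-in passed in by the REPL
> wrapper):
>
>     import scala.tools.nsc.Settings
>     import scala.tools.nsc.interpreter.{IMain, JPrintWriter}
>
>     class SparkInterpreter(settings: Settings, out: JPrintWriter,
>         initializeSpark: () => Unit) extends IMain(settings, out) {
>       // initializeSynchronous() runs once the interpreter is fully set
>       // up, in both old and new Scala, so we hook Spark's setup onto it.
>       override def initializeSynchronous(): Unit = {
>         super.initializeSynchronous()
>         initializeSpark()
>       }
>     }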
>
> However, in the newer versions of Scala the first thing the REPL calls is
> printWelcome, so with this approach the welcome message is shown first and
> only then the URL of the Spark UI. This causes UI inconsistencies between
> different versions of Scala.
>
> We could also initialize Spark inside printWelcome itself, which feels
> hackier to me. It would only work with newer versions of Scala, since in
> older versions printWelcome is called at the end of the initialization
> process. If we decide to go this route, users essentially cannot use Scala
> older than 2.11.9.
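>
> For comparison, a sketch of that alternative, with the same stand-in
> helper:
>
>     import scala.tools.nsc.interpreter.ILoop
>
>     class SparkILoop extends ILoop {
>       def initializeSpark(): Unit = { /* create SparkSession, print UI URL */ }
>
>       // Piggy-back Spark's setup on the welcome banner. This only works
>       // where printWelcome is the first thing the REPL calls (newer Scala).
>       override def printWelcome(): Unit = {
>         initializeSpark()
>         super.printWelcome()
>       }
>     }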
>
> I think this is also a blocker for moving to the newer versions of Scala
> 2.12.x, since they have the same issue.
>
> In my opinion, Scala should fix the root cause and provide a stable hook
> for third-party developers to initialize their custom code.
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |  
> Apple, Inc
>
> > On Jun 7, 2018, at 6:43 AM, Felix Cheung <felixcheun...@hotmail.com>
> > wrote:
> >
> > +1
> >
> > Spoke to Dean as well and mentioned the problem with 2.11.12:
> > https://github.com/scala/bug/issues/10913
> >
> > _____________________________
> > From: Sean Owen <sro...@gmail.com>
> > Sent: Wednesday, June 6, 2018 12:23 PM
> > Subject: Re: Scala 2.12 support
> > To: Holden Karau <hol...@pigscanfly.ca>
> > Cc: Dean Wampler <deanwamp...@gmail.com>, Reynold Xin <
> > r...@databricks.com>, dev <dev@spark.apache.org>
> >
> >
> > If it means no change to 2.11 support, this seems OK to me for Spark
> > 2.4.0. The 2.12 support is separate and has never been mutually compatible
> > with 2.11 builds anyway. (I also hope, and suspect, that the changes are
> > minimal; tests already pass almost entirely with no change to the closure
> > cleaner when built for 2.12.)
> >
> > On Wed, Jun 6, 2018 at 1:33 PM Holden Karau <hol...@pigscanfly.ca>
> > wrote:
> > Just chatted with Dean at the summit, and it sounds like, per Adriaan,
> > there is a fix in 2.13 for the API change issue that could be backported
> > to 2.12, so how about we try to get this ball rolling?
> >
> > It sounds like it would also need a closure cleaner change, which could
> > be backwards compatible. But since it’s such a core component we might
> > want to be cautious with it: when building for 2.11 we could use the old
> > cleaner code, and for 2.12 the new code, so we don’t break anyone.
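> >
> > A hedged sketch of how that could look in sbt (the scala-2.11/scala-2.12
> > directory names are hypothetical; Spark's real build would do the
> > equivalent in its own build files):
> >
> >     // Compile version-specific sources from src/main/scala-2.11 or
> >     // src/main/scala-2.12, so the 2.11 build keeps the old
> >     // ClosureCleaner and the 2.12 build gets the new one.
> >     unmanagedSourceDirectories in Compile += {
> >       val dir = CrossVersion.partialVersion(scalaVersion.value) match {
> >         case Some((2, 11)) => "scala-2.11"
> >         case _             => "scala-2.12"
> >       }
> >       (sourceDirectory in Compile).value / dir
> >     }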
> >
> > How do folks feel about this?
> >
