I figured this out. It's another effect of a new behavior in 2.12:
eta-expansion of zero-argument method values is deprecated.
Imagine:

def f(): String = "foo"
def g(fn: () => String) = ???

g(f) works in 2.11 without warning. It generates a deprecation warning in
2.12, because the compiler wants you to explicitly make a function from the
method reference: g(() => f). It may become an error in 2.13.
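To make the behavior change concrete, here's a minimal, self-contained sketch (EtaExample is just an illustrative wrapper object, not anything from Spark):

```scala
object EtaExample {
  def f(): String = "foo"                // a zero-argument method
  def g(fn: () => String): String = fn() // expects a Function0

  def main(args: Array[String]): Unit = {
    // In 2.11, g(f) compiles silently via eta-expansion.
    // In 2.12 it emits a deprecation warning; writing the function
    // literal explicitly avoids it:
    println(g(() => f()))
  }
}
```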

But this affects implicit resolution. Some of the implicits that power
SparkContext.sequenceFile() need to change from methods returning
WritableConverter[T] to vals of type () => WritableConverter[T].
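Roughly, the shape of the change looks like this. This is an illustrative sketch only: WritableConverter is a simplified stand-in for Spark's class, and intWritableConverter and needsConverter are hypothetical names, not Spark's actual API.

```scala
class WritableConverter[T] // simplified stand-in for Spark's class

object Converters {
  // In 2.11, an implicit *method* returning WritableConverter[Int] could
  // satisfy an implicit parameter of type () => WritableConverter[Int]
  // via eta-expansion. In 2.12 it no longer can, so the implicit is
  // declared as a function-typed val instead:
  implicit val intWritableConverter: () => WritableConverter[Int] =
    () => new WritableConverter[Int]
}

object Demo {
  import Converters._
  // Mirrors the shape of sequenceFile's implicit parameter (simplified):
  def needsConverter[T](implicit kcf: () => WritableConverter[T]): WritableConverter[T] =
    kcf()

  def main(args: Array[String]): Unit =
    println(needsConverter[Int]) // resolves via the function-typed implicit val
}
```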

I'm working through this and other deprecated items in 2.12 and preparing
more 2.11-compatible changes that allow these to work cleanly in 2.12.

On Fri, Sep 15, 2017 at 11:21 AM Sean Owen <so...@cloudera.com> wrote:

> I'm working on updating to Scala 2.12, and have hit a compile error in
> Scala 2.12 that I'm struggling to design a fix for (one that doesn't modify
> the API significantly). If you "./dev/change-scala-version.sh 2.12" and
> compile, you'll see errors like...
>
> [error]
> /Users/srowen/Documents/Cloudera/spark/core/src/test/scala/org/apache/spark/FileSuite.scala:100:
> could not find implicit value for parameter kcf: () =>
> org.apache.spark.WritableConverter[org.apache.hadoop.io.IntWritable]
> [error] Error occurred in an application involving default arguments.
> [error]     val output = sc.sequenceFile[IntWritable, Text](outputDir)
>
> Clearly implicit resolution changed a little bit in 2.12 somehow. I
> actually don't recall seeing this error before, so it might be related to
> 2.12.3, but I'm not sure.
>
> As you can see, the implicits that have always existed, are imported, and
> should apply here don't seem to be found.
>
> If anyone is a Scala expert and could glance at this, you might help save
> me a lot of puzzling.
>