Thanks Sean - yup, I was having issues with Scala 2.12 for some stuff, so I
kept 2.11…
Casting works. It makes the code a little ugly, but… It's definitely a Scala
2.12 vs. 2.11 issue, not a Spark 3 one specifically.
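For reference, the cast workaround looks roughly like this — a sketch, not my
exact code (incrementalDf is the Dataset<Row> from the snippet quoted below):

import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;

// The explicit cast selects Spark's Java-specific
// map(MapFunction, Encoder) overload, so the Scala map(Function1)
// overload no longer competes for the lambda:
Dataset<Integer> dotsDs = incrementalDf
    .map((MapFunction<Row, Integer>) status -> {
      double x = Math.random() * 2 - 1;
      double y = Math.random() * 2 - 1;
      return (x * x + y * y <= 1) ? 1 : 0;
    }, Encoders.INT());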
jg
> On Dec 28, 2019, at 1:15 PM, Sean Owen wrote:
>
> Yes, it's necessary to cast the lambda in Java as (MapFunction)
> in many cases. This is because the Scala-specific and Java-specific
> versions of .map() both end up accepting a function object that the
> lambda can match, and an Encoder. What I'd have to go back and look up
> is why that would be different…
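For context, the two overloads in play look roughly like this (paraphrased
from the Dataset API, not the verbatim declarations):

// Scala-specific: takes a scala.Function1 plus an implicit Encoder
//   def map[U](func: T => U)(implicit enc: Encoder[U]): Dataset[U]
// Java-specific: takes Spark's MapFunction plus an explicit Encoder
//   def map[U](func: MapFunction[T, U], enc: Encoder[U]): Dataset[U]
//
// Scala 2.12 compiles Function1 as a functional interface, so a Java
// lambda can target either parameter type and javac reports the call
// as ambiguous; under 2.11, Function1 is not lambda-compatible from
// Java, so only the MapFunction overload matches and no cast is needed.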
I forgot… it does the same thing with the reducer…
int dartsInCircle = dotsDs.reduce((x, y) -> x + y);
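(Presumably the same cast trick applies here — a sketch, assuming Spark's
Java-specific ReduceFunction interface:)

import org.apache.spark.api.java.function.ReduceFunction;

// Casting to ReduceFunction<Integer> picks the Java-specific
// reduce(ReduceFunction) overload over the Scala reduce((T, T) => T):
int dartsInCircle = dotsDs.reduce((ReduceFunction<Integer>) (x, y) -> x + y);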
jg
> On Dec 28, 2019, at 12:38 PM, Jean-Georges Perrin wrote:
>
> Hey guys,
>
> This code:
>
> Dataset<Row> incrementalDf = spark
>     .createDataset(l, Encoders.INT())
>     .toDF();
>
> Dataset<Integer> dotsDs = incrementalDf
>     .map(status -> {
>       double x = Math.random() * 2 - 1;
>       double y = Math.random() * 2 - 1;
>       counter++;
>       // count a dart that lands inside the unit circle
>       if (x * x + y * y <= 1) {
>         return 1;
>       }
>       return 0;
>     }, Encoders.INT());