Hello Spark community -

Do Spark 2.0 Datasets *not support* Scala value classes (i.e., case classes
that "extend AnyVal", with all of their usual limitations)?

I am trying to do something like this:

// assumes a SparkSession named `spark`, e.g. in spark-shell
case class FeatureId(value: Int) extends AnyVal

import spark.implicits._
val seq = Seq(FeatureId(1), FeatureId(2), FeatureId(3))
val ds = spark.createDataset(seq)   // compiles fine
ds.count                            // fails at runtime


This compiles, but breaks at runtime with a cryptic error along the lines
of "cannot find int at value". If I remove the "extends AnyVal" part,
everything works.
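
For comparison, the same code with a plain case class runs as expected (a
minimal sketch, again assuming a SparkSession named `spark`):

case class FeatureId(value: Int)   // no "extends AnyVal"

import spark.implicits._
val seq = Seq(FeatureId(1), FeatureId(2), FeatureId(3))
val ds = spark.createDataset(seq)  // Dataset[FeatureId]
ds.count                           // returns 3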

Value classes are a great Scala feature: they give you static type checking
without the runtime allocation cost. Are they simply not supported in Spark
Datasets?
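
In the meantime, the only workaround I see is to keep the value class out of
the Dataset entirely, encoding the underlying Int and re-wrapping at the
edges (just a sketch, and it gives up exactly the type safety I wanted):

case class FeatureId(value: Int) extends AnyVal

import spark.implicits._
// encode the raw Ints instead of the value class
val ds = spark.createDataset(seq.map(_.value))            // Dataset[Int]
val ids: Array[FeatureId] = ds.collect().map(FeatureId)   // re-wrap on the driver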

Thanks!
