Works perfectly! Thanks Herman.
On Tue, Aug 25, 2020 at 12:03 PM Herman van Hovell <
her...@databricks.com> wrote:
> Hi Robert,
>
> Your Spark 3.0 code is missing the encoder that converts the Row to an
> InternalRow. Your Spark 3.0 code should look like this:
>
> def rowToCaseClass[C <: Prod
Hi everyone
Thanks Takeshi. I ran into the same issue as Mark with my row-to-case-class
converter:
def rowToCaseClass[C <: Product : TypeTag](r: Row)(implicit
    encs: (ExpressionEncoder[Row], ExpressionEncoder[C])): C = {
  val ir = encs._1.toRow(r)
  encs._2.fromRow(ir)
}
So in Spark 3.0 I would
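(For reference: applying the createSerializer()/createDeserializer() API from the commit Takeshi links below, the converter above might look roughly like this in Spark 3.0. This is only a sketch, not verified against the thread; in particular, the resolveAndBind() call on the target encoder is an assumption carried over from the usual Spark 2.x/3.x encoder usage.)

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
import scala.reflect.runtime.universe.TypeTag

// Sketch only: toRow/fromRow were replaced by createSerializer()/
// createDeserializer() in Spark 3.0. resolveAndBind() before
// deserializing is an assumption, mirroring typical encoder usage.
def rowToCaseClass[C <: Product : TypeTag](r: Row)(implicit
    encs: (ExpressionEncoder[Row], ExpressionEncoder[C])): C = {
  val toInternal = encs._1.createSerializer()                      // Row => InternalRow
  val fromInternal = encs._2.resolveAndBind().createDeserializer() // InternalRow => C
  fromInternal(toInternal(r))
}
```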
Hi,
Have you tried it like this?
--
{ r: InternalRow => enc1.fromRow(r) }
===>
{ r: InternalRow =>
  val fromRow = enc1.createDeserializer()
  fromRow(r)
}
https://github.com/apache/spark/commit/e7fef70fbbea08a38316abdaa9445123bb8c39e2
Bests,
Takeshi
On Thu, Aug 20, 2020 at 1:52 PM Mark