A naive workaround may be to transform the json4s JValue into a String (using
something like compact()) and process it as a String. Once you are done with
the last transformation, you could parse it back into a JValue (using
something like parse()).
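Something along these lines, roughly (a sketch assuming the json4s jackson
backend; the helper names and the alternate case class are just for
illustration, not anything Spark-specific):

import org.json4s._
import org.json4s.jackson.JsonMethods.{compact, render, parse}

// JObject -> String before Spark ever sees the value
def toJsonString(b: JObject): String = compact(render(b))

// String -> JValue once you are done processing
def fromJsonString(s: String): JValue = parse(s)

// e.g. carry the payload as a plain String field instead of a JObject
case class SomeClassAsString(a: String, b: String)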
Thanks,
Muthu
On Wed, Sep 19, 2018 at 6:35 AM Arko Provo Mukherjee wrote:
Hello Spark Gurus,
I am running into an issue with encoding and wanted your help.
I have a case class with a JObject in it. Ex:
*case class SomeClass(a: String, b: JObject)*
I also have an encoder for this case class:
*val encoder = Encoders.product[**SomeClass**]*
Now I am creating a DataFrame
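Roughly, the setup looks like this (a simplified sketch of what I described
above; the local SparkSession and the empty JObject value are just
placeholders):

import org.apache.spark.sql.{Encoders, SparkSession}
import org.json4s.JObject

case class SomeClass(a: String, b: JObject)

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// The JObject field is what trips up Spark's encoder machinery here,
// since the product encoder has no built-in mapping for json4s types.
val encoder = Encoders.product[SomeClass]
val ds = spark.createDataset(Seq(SomeClass("x", JObject())))(encoder)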
Hello,
Is it possible to set fields as required when writing a DataFrame/Dataset
from Spark? I wasn't able to find a way to enforce a schema, and the
default schema used has a lot of fields defined as "optional", whereas I'd
like them to be "required".
Thanks!
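One workaround that is sometimes suggested for this is to rebuild the
DataFrame against a copy of its schema with nullable = false before writing.
A rough sketch (the sample columns and output path are made up, and whether
the writer actually emits non-nullable fields as "required" can depend on the
Spark version and output format):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.StructType

val spark = SparkSession.builder().master("local[*]").getOrCreate()
val df = spark.range(10).selectExpr("cast(id as int) as id", "cast(id as string) as name")

// Copy the existing schema, flipping every field to non-nullable.
val requiredSchema = StructType(df.schema.map(_.copy(nullable = false)))

// Rebuild the DataFrame against the stricter schema and write it out.
val strict = spark.createDataFrame(df.rdd, requiredSchema)
strict.write.parquet("/tmp/required-example")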