There is no public API for writing encoders at the moment, though we are hoping to open this up in Spark 2.1.
What is not working about encoders for options? Which version of Spark are you running? This is working as I would expect:
https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/1023043053387187/1073771007111588/2840265927289860/latest.html

On Thu, Jun 16, 2016 at 8:37 AM, Richard Marscher <rmarsc...@localytics.com> wrote:

> Are there any user or dev guides for writing Encoders? I'm trying to read
> through the source code to figure out how to write a proper Option[T]
> encoder, but it's not straightforward without deep spark-sql source
> knowledge. Is it unexpected for users to need to write their own Encoders,
> given the availability of ExpressionEncoder.apply and the bean encoder
> method?
>
> As additional color on the Option[T] encoder: I have tried using
> ExpressionEncoder, but it does not treat nulls properly and passes them
> straight through. I'm not sure whether this is a side effect of
> https://github.com/apache/spark/pull/13425, where a beneficial change was
> made to have missing parts of joins continue through as nulls instead of
> the default value for the data type (like -1 for ints). My thought is
> that the same behavior would then also apply to the generated generic
> Option encoder, since it would see a null column value and skip passing
> it into Option.apply.
>
> --
> Richard Marscher
> Senior Software Engineer
> Localytics
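
For readers following the thread, here is a minimal sketch of the approach being discussed: deriving an Option[T] encoder through ExpressionEncoder.apply. Note that org.apache.spark.sql.catalyst is not public API (per the reply above), so the class may change between releases; this assumes a Spark 2.x SparkSession, and whether a null column value comes back as None or is passed through is exactly the behavior being questioned.

```scala
// Minimal sketch (internal, non-public API): deriving an Encoder for
// Option[Long] via ExpressionEncoder.apply, as mentioned in the thread.
import org.apache.spark.sql.{Encoder, SparkSession}
import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder

object OptionEncoderSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("option-encoder-sketch")
      .getOrCreate()

    // TypeTag-driven derivation, the same entry point referenced above.
    implicit val optLongEncoder: Encoder[Option[Long]] =
      ExpressionEncoder[Option[Long]]()

    // The open question in the thread is whether a null value round-trips
    // as None or is passed through as null; that depends on the Spark
    // version and on how the generated encoder handles nulls.
    val ds = spark.createDataset(Seq(Option(1L), None, Option(3L)))
    ds.collect().foreach(println)

    spark.stop()
  }
}
```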