Ah, now I see what "broken" means. Thanks, Yi.
I personally think option 1 is the best for existing Spark users to
support the use case you suggested above.
So, I think this decision depends on how difficult it is to implement "get
Scala lambda parameter types by reflection" and how complex that
implementation would be.
(I'm not familiar with the Scala 2.12 implementation, so I'm not really
sure how difficult it is.)
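
For reference, here is a minimal sketch of what option 1 might look like on
Scala 2.12 (the helper is my own illustration, not Spark's actual code):
function literals compile via LambdaMetafactory with the serializable flag,
so the closure class carries a synthetic writeReplace method that returns a
java.lang.invoke.SerializedLambda, whose implMethodSignature encodes the
lambda's parameter types.

  import java.lang.invoke.SerializedLambda

  // Hypothetical helper, for illustration only: extract the JVM method
  // descriptor of the lambda's synthetic implementation method.
  object LambdaReflection {
    def implMethodSignature(closure: AnyRef): Option[String] =
      scala.util.Try {
        // writeReplace is emitted only for serializable lambdas.
        val m = closure.getClass.getDeclaredMethod("writeReplace")
        m.setAccessible(true)
        m.invoke(closure).asInstanceOf[SerializedLambda].getImplMethodSignature
      }.toOption
  }

For example, implMethodSignature((s: String) => s.length) would return
"(Ljava/lang/String;)I", from which we could tell that the parameter is a
reference type and decide whether to add the null check. How reliable this
is across compiler versions is exactly the open question.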

If we cannot choose option 1, I like option 2 better than adding a new
API for this use case (option 3).

Bests,
Takeshi

On Sat, Mar 14, 2020 at 6:24 PM wuyi <yi...@databricks.com> wrote:

> Hi Takeshi, thanks for your reply.
>
> Before the breaking change, we only did the null check for primitive types
> and left null values of non-primitive types to the UDF itself, in case they
> are handled specially, e.g., a UDF may return something else for a null
> String.
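
To make the pre-break behavior wuyi describes concrete, here is a minimal
sketch (the UDF bodies are my own illustration):

  import org.apache.spark.sql.functions.udf

  // A UDF over a non-primitive type receives nulls and may handle them
  // itself, as described above:
  val describe =
    udf((s: String) => if (s == null) "missing" else s.toUpperCase)

  // For a primitive-typed parameter, the automatic null check kicks in:
  // a null input produces null without running the body, since a JVM Int
  // cannot hold null.
  val increment = udf((i: Int) => i + 1)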

-- 
---
Takeshi Yamamuro
