I don't think it's possible to get the parameter types by reflection
anymore -- in the JVM they are lambdas now. As I recall, a few people
looked for a solution back when we added 2.12 support and couldn't find
one. This isn't 'new', in that it has been the case ever since Scala
2.12. If there is a better idea, sure.
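
For anyone curious, here is a minimal sketch (plain Scala, nothing
Spark-specific) of what reflection actually sees on 2.12:

  val f: String => Int = _.length
  // On 2.12 this is a JVM lambda produced by LambdaMetafactory; the only
  // "apply" on the synthetic class has the erased signature
  // (Object) => Object, so the parameter type String is not recoverable.
  f.getClass.getMethods
    .filter(_.getName == "apply")
    .foreach(m => println(m.getParameterTypes.toSeq -> m.getReturnType))

For completeness, one partial trick that has come up elsewhere (a sketch
only; it applies just to serializable lambdas, recovers the erased JVM
signature rather than Scala types, and captured variables show up as
extra leading parameters, so it falls short of a real solution):

  import java.lang.invoke.SerializedLambda

  // Read the erased signature of the lambda's implementation method,
  // e.g. "(Ljava/lang/String;)I" for (s: String) => s.length.
  def implMethodSignature(f: AnyRef): Option[String] =
    scala.util.Try {
      val wr = f.getClass.getDeclaredMethod("writeReplace")
      wr.setAccessible(true)
      wr.invoke(f).asInstanceOf[SerializedLambda].getImplMethodSignature
    }.toOption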

On Sat, Mar 14, 2020 at 5:50 AM Takeshi Yamamuro <linguin....@gmail.com> wrote:
>
> Ah, I see now what "broken" means. Thanks, Yi.
> I personally think option 1 is the best for existing Spark users to 
> support the use case you suggested above.
> So, I think this decision depends on how difficult it is to implement "get 
> Scala lambda parameter types by reflection"
> and the complexity of its implementation.
> (I'm not familiar with the 2.12 implementation, so I'm not really sure how 
> difficult it is.)
>
> If we cannot choose option 1, I like option 2 better than
> adding a new API for the use case (option 3).
>
> Bests,
> Takeshi
>
> On Sat, Mar 14, 2020 at 6:24 PM wuyi <yi...@databricks.com> wrote:
>>
>> Hi Takeshi, thanks for your reply.
>>
>> Before the breakage, we only did the null check for primitive types and left
>> null values of non-primitive types to the UDF itself, in case they are handled
>> specially, e.g., a UDF may return something else for a null String.
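>>
>> To make the two cases concrete, here are hypothetical examples (not the
>> actual internal code):
>>
>>   import org.apache.spark.sql.functions.udf
>>
>>   // Primitive parameter: an Int cannot hold null, so the null check can
>>   // happen outside the UDF and yield null without invoking the function.
>>   val plusOne = udf((x: Int) => x + 1)
>>
>>   // Non-primitive parameter: null is a legal String value, so it is
>>   // passed through and the UDF may handle it specially.
>>   val describe = udf((s: String) =>
>>     if (s == null) "<missing>" else s.toUpperCase)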
>>
>
>
> --
> ---
> Takeshi Yamamuro
