Generally, the 22-element limitation comes from Scala 2.10.

In Scala 2.11, the 22-field limit on case classes is fixed, but that said, I'm not 
sure whether other limitations might still apply to UDFs in Java.

_____________________________
From: Aakash Basu <aakash.spark....@gmail.com>
Sent: Monday, December 25, 2017 9:13 PM
Subject: Re: Passing an array of more than 22 elements in a UDF
To: Felix Cheung <felixcheun...@hotmail.com>
Cc: ayan guha <guha.a...@gmail.com>, user <user@spark.apache.org>


What's the advantage of using that specific version for this? Please shed some 
light on it.

On Mon, Dec 25, 2017 at 6:51 AM, Felix Cheung 
<felixcheun...@hotmail.com<mailto:felixcheun...@hotmail.com>> wrote:
Or use it with Scala 2.11?

________________________________
From: ayan guha <guha.a...@gmail.com<mailto:guha.a...@gmail.com>>
Sent: Friday, December 22, 2017 3:15:14 AM
To: Aakash Basu
Cc: user
Subject: Re: Passing an array of more than 22 elements in a UDF

Hi, I think you are on the right track. You can stuff all your params into a 
suitable data structure, like an array or a dict, and pass that structure as a 
single param to your UDF.

On Fri, 22 Dec 2017 at 2:55 pm, Aakash Basu 
<aakash.spark....@gmail.com<mailto:aakash.spark....@gmail.com>> wrote:
Hi,

I am using Spark 2.2 with Java. Can anyone please suggest how to take more than 22 
parameters in a UDF? For example, what if I want to pass all the parameters as an 
array of integers?

Thanks,
Aakash.
--
Best Regards,
Ayan Guha
