Hi,
I am trying to write a new aggregate function
(https://issues.apache.org/jira/browse/SPARK-17691) and I want it to support
all ordered types.
I have run into several issues:

1. How do I convert the child expression's values to a standard Scala
type (e.g. I need an Array[Int] for IntegerType and an Array[Double] for
DoubleType)? The only method I have found so far is to match on each of
the types (see the first sketch below). Is there a better way?

2. What are the corresponding Scala types for DecimalType,
TimestampType, DateType and BinaryType?

3. Should BinaryType be a legal input type for such a function?

4. I need to serialize the resulting array (i.e. turn it into an
Array[Byte] to work with TypedImperativeAggregate). I can match on
standard types such as Int and Double, but I do not know of a generic
way to do it (the second sketch below shows the fallback I am considering).
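
For question 1, this is the per-type match I have so far -- a minimal
sketch rather than working code: toScalaArray and values are names I
made up for illustration, and the aggregate wiring is omitted. The
Date, Timestamp and Decimal cases reflect what I could tell from the
Catalyst internal representations, which is really part of question 2,
so please correct me if those mappings are wrong:

import org.apache.spark.sql.types._

// Values arrive boxed (Any) from the internal rows; one case per type.
def toScalaArray(dataType: DataType, values: Seq[Any]): AnyRef =
  dataType match {
    case IntegerType    => values.map(_.asInstanceOf[Int]).toArray    // Array[Int]
    case LongType       => values.map(_.asInstanceOf[Long]).toArray   // Array[Long]
    case DoubleType     => values.map(_.asInstanceOf[Double]).toArray // Array[Double]
    case DateType       => values.map(_.asInstanceOf[Int]).toArray    // days since epoch?
    case TimestampType  => values.map(_.asInstanceOf[Long]).toArray   // microseconds?
    case _: DecimalType => values.map(_.asInstanceOf[Decimal]).toArray // o.a.s.sql.types.Decimal
    case BinaryType     => values.map(_.asInstanceOf[Array[Byte]]).toArray
    case other =>
      throw new UnsupportedOperationException(s"unsupported type $other")
  }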
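
For question 4, the only generic fallback I can think of is plain Java
serialization over the whole buffer, assuming every element type we end
up supporting is Serializable (the boxed primitives and, as far as I
can see, Decimal all are). serializeBuffer and deserializeBuffer are
again made-up names, and I suspect this is slower than a hand-written
per-type encoding:

import java.io.{ByteArrayInputStream, ByteArrayOutputStream,
  ObjectInputStream, ObjectOutputStream}

// Round-trips the aggregation buffer through Array[Byte], as needed by
// TypedImperativeAggregate's serialize/deserialize methods.
def serializeBuffer(buffer: Array[Any]): Array[Byte] = {
  val bos = new ByteArrayOutputStream()
  val oos = new ObjectOutputStream(bos)
  try { oos.writeObject(buffer); oos.flush(); bos.toByteArray }
  finally oos.close()
}

def deserializeBuffer(bytes: Array[Byte]): Array[Any] = {
  val ois = new ObjectInputStream(new ByteArrayInputStream(bytes))
  try ois.readObject().asInstanceOf[Array[Any]]
  finally ois.close()
}
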
Thanks,
                Assaf.



