Re: Generic types and pair RDDs

2014-04-01 Thread Daniel Siegmann
That worked, thank you both! Thanks also Aaron for the list of things I need to read up on - I hadn't heard of ClassTag before.

On Tue, Apr 1, 2014 at 5:10 PM, Aaron Davidson wrote:
> Koert's answer is very likely correct. This implicit definition which
> converts an RDD[(K, V)] to provide Pair

Re: Generic types and pair RDDs

2014-04-01 Thread Aaron Davidson
Koert's answer is very likely correct. This implicit definition, which converts an RDD[(K, V)] to provide PairRDDFunctions, requires that a ClassTag is available for K:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkContext.scala#L1124
To fully understand what's goi
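
A minimal, self-contained sketch of the mechanism described above. MyRDD, MyPairFunctions, toPairFunctions, and joins are hypothetical stand-ins for RDD, PairRDDFunctions, and the rddToPairRDDFunctions conversion (whose exact signature varies by Spark version); the point is only that the conversion itself demands a ClassTag[K], so it can only apply where one is in scope.

    import scala.reflect.ClassTag
    import scala.language.implicitConversions

    object ClassTagSketch {
      // Stand-in for org.apache.spark.rdd.RDD.
      class MyRDD[T]

      // Stand-in for PairRDDFunctions: the extra methods (join, reduceByKey, ...)
      // that only exist for RDDs of key/value pairs.
      class MyPairFunctions[K, V](rdd: MyRDD[(K, V)]) {
        def join[W](other: MyRDD[(K, W)]): MyRDD[(K, (V, W))] = new MyRDD[(K, (V, W))]
      }

      // Simplified analogue of the implicit conversion: it requires a ClassTag[K].
      implicit def toPairFunctions[K: ClassTag, V](rdd: MyRDD[(K, V)]): MyPairFunctions[K, V] =
        new MyPairFunctions(rdd)

      // Compiles: the `K: ClassTag` context bound supplies the ClassTag[K] the
      // conversion needs, so `join` becomes available on the generic pair RDD.
      def joins[K: ClassTag](a: MyRDD[(K, Int)], b: MyRDD[(K, Int)]): MyRDD[(K, (Int, Int))] =
        a.join(b)

      // Does not compile if uncommented: with K fully abstract there is no
      // ClassTag[K], the conversion is not applicable, and `join` is not a
      // member of MyRDD.
      // def joinsBroken[K](a: MyRDD[(K, Int)], b: MyRDD[(K, Int)]) = a.join(b)
    }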

Re: Generic types and pair RDDs

2014-04-01 Thread Koert Kuipers
import org.apache.spark.SparkContext._
import org.apache.spark.rdd.RDD
import scala.reflect.ClassTag

def joinTest[K: ClassTag](rddA: RDD[(K, Int)], rddB: RDD[(K, Int)]): RDD[(K, Int)] = {
  rddA.join(rddB).map { case (k, (a, b)) => (k, a + b) }
}

On Tue, Apr 1, 2014 at 4:55 PM, Daniel
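
For completeness, a quick usage sketch of the joinTest above (not part of the original message), assuming a spark-shell session where sc is the already-created SparkContext and the example data is made up:

    // Hypothetical example data; in spark-shell, sc is already defined.
    val rddA = sc.parallelize(Seq(("a", 1), ("b", 2)))
    val rddB = sc.parallelize(Seq(("a", 10), ("b", 20)))

    // K is inferred as String, and the implicit ClassTag[String] satisfies the
    // context bound, so the PairRDDFunctions conversion applies inside joinTest.
    joinTest(rddA, rddB).collect()
    // e.g. Array((a,11), (b,22))  (element order is not guaranteed)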