Can you show the full stack trace (or the top 10 lines) and the snippet that
uses your MyRDD?
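
In the meantime, one guess, since the error shows the static type as
org.apache.spark.rdd.RDD[(Int, String)] rather than MyRDD[Int, String]: the
reference you call customMethod on is typed as the parent RDD, either because
it was declared that way or because it came back from a standard
transformation (those always return plain RDDs). A minimal sketch of what I
mean, run in spark-shell with MyRDD on the classpath; base, bulk, asParent and
the HashPartitioner are just placeholders I made up:

import scala.collection.mutable.ArrayBuffer
import org.apache.spark.HashPartitioner
import org.apache.spark.rdd.RDD

val base: RDD[(Int, String)] = sc.parallelize(Seq((1, "a"), (2, "b")))  // placeholder data
val bulk = ArrayBuffer.empty[(Int, (String, Int))]                      // placeholder bulk

// Declared with the concrete type: customMethod is visible.
val myrdd: MyRDD[Int, String] = new MyRDD(base, new HashPartitioner(2))
myrdd.customMethod(bulk)                           // compiles

// Declared (or returned) as the parent type: the method is not a member of RDD.
val asParent: RDD[(Int, String)] = myrdd
// asParent.customMethod(bulk)                     // error: value customMethod is not a member of ...

// If you know the object really is a MyRDD, a cast recovers the subtype:
asParent.asInstanceOf[MyRDD[Int, String]].customMethod(bulk)

If the type is being lost through ordinary transformations, then rather than
pushing customMethod up into the abstract RDD, the usual pattern is an
implicit enrichment class on RDD[(K, V)], the same way PairRDDFunctions adds
extra methods to pair RDDs.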

Thanks

On Sun, Mar 27, 2016 at 9:22 AM, Tenghuan He <tenghua...@gmail.com> wrote:

> Hi everyone,
>
>     I am creating a custom RDD that extends RDD and adds a custom method;
> however, the custom method cannot be found.
>     The custom RDD looks like the following:
>
> import scala.collection.mutable.ArrayBuffer
>
> import org.apache.spark.{Dependency, Partition, Partitioner, TaskContext}
> import org.apache.spark.rdd.RDD
>
> class MyRDD[K, V](
>     var base: RDD[(K, V)],
>     part: Partitioner
>   ) extends RDD[(K, V)](base.context, Nil) {
>
>   def customMethod(bulk: ArrayBuffer[(K, (V, Int))]): MyRDD[K, V] = {
>     // ... custom code here
>   }
>
>   override def compute(split: Partition, context: TaskContext): Iterator[(K, V)] = {
>     // ... custom code here
>   }
>
>   override protected def getPartitions: Array[Partition] = {
>     // ... custom code here
>   }
>
>   override protected def getDependencies: Seq[Dependency[_]] = {
>     // ... custom code here
>   }
> }
>
> In spark-shell, the overridden methods work well, but when calling
> myrdd.customMethod(bulk), it fails with:
> <console>:33: error: value customMethod is not a member of
> org.apache.spark.rdd.RDD[(Int, String)]
>
> Can anyone tell me why the custom method cannot be found?
> Or do I have to add customMethod to the abstract RDD class and then override
> it in the custom RDD?
>
> PS: Spark version: 1.5.1
>
> Thanks & Best regards
>
> Tenghuan
>
>
>
