It looks like a Scala issue. It seems the implicit conversion to ArrayOps does not apply when the type is Array[Nothing].
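
For concreteness, this is the failure mode as I understand it (just a sketch, assuming a spark-shell session where sc is the predefined SparkContext; the behaviour noted on the last line restates the report in the quoted mail below rather than something I re-verified):

    val inferred = sc.emptyRDD       // type parameter inferred as Nothing, so this is an RDD[Nothing]
    val arr = inferred.collect()     // collect then gives an Array[Nothing]
    arr.foreach(println)             // reportedly fails: no ArrayOps conversion kicks in for Array[Nothing]
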
Try giving a type to the empty RDD:

val emptyRdd: RDD[Any] = sc.emptyRDD
emptyRdd.collect.foreach(println) // prints a line return

-kr, Gerard.

On Fri, Nov 14, 2014 at 11:35 AM, Deep Pradhan <pradhandeep1...@gmail.com> wrote:

> Thank you, Gerard.
> I was trying val emptyRdd = sc.emptyRDD.
>
> Yes, it works, but I am not able to do *emptyRdd.collect.foreach(println)*.
>
> Thank you
>
> On Fri, Nov 14, 2014 at 3:58 PM, Gerard Maas <gerard.m...@gmail.com> wrote:
>
>> If I remember correctly, EmptyRDD is private[spark].
>>
>> You can create an empty RDD using the Spark context:
>>
>> val emptyRdd = sc.emptyRDD
>>
>> -kr, Gerard.
>>
>> On Fri, Nov 14, 2014 at 11:22 AM, Deep Pradhan <pradhandeep1...@gmail.com> wrote:
>>
>>> To get an empty RDD, I did this:
>>>
>>> I have an RDD with one element. I created another RDD using filter so
>>> that the second RDD does not contain anything. This achieved what I wanted,
>>> but it is a very crude way of creating an empty RDD. Is there a more
>>> efficient way to do this?
>>>
>>> Thank you
>>>
>>> On Fri, Nov 14, 2014 at 3:39 PM, Deep Pradhan <pradhandeep1...@gmail.com> wrote:
>>>
>>>> How do I create an empty RDD in Spark?
>>>>
>>>> Thank you
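
For completeness, a short sketch that pulls the suggestions in this thread together (a sketch only, assuming a spark-shell session where sc is the predefined SparkContext; the element types String and Int are arbitrary examples, and parallelizing an empty Seq is an extra option not mentioned above):

    import org.apache.spark.rdd.RDD

    val empty1: RDD[String] = sc.emptyRDD[String]             // give the type parameter explicitly
    val empty2: RDD[Any]    = sc.emptyRDD                     // or annotate the val, as suggested above
    val empty3: RDD[Int]    = sc.parallelize(Seq.empty[Int])  // avoids the filter workaround from the first mail

    empty1.collect().foreach(println)   // empty array, so nothing is printed
    empty2.count()                      // 0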