Re: java.lang.NullPointerException met when computing new RDD or use .count

2014-03-17 Thread Ian O'Connell
I'm guessing the other result was wrong, or simply never evaluated here. The laziness of RDD transformations may have let the expression be built, but it wouldn't actually work: nested RDDs are not supported.
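(For illustration, a minimal sketch of the pattern Ian describes, assuming a live SparkContext named sc; the variable names are hypothetical:)

    val outer = sc.parallelize(1 to 3)
    val inner = sc.parallelize(4 to 6)
    // Laziness lets the nested expression be built without complaint...
    val nested = outer.map(x => inner.map(_ + x).count())
    // ...but evaluating it fails, because an RDD cannot be used inside
    // a closure that runs on the executors.
    nested.count() // throws a NullPointerException at runtime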

Re: java.lang.NullPointerException met when computing new RDD or use .count

2014-03-17 Thread anny9699
Hi Andrew, thanks for the reply. However, I did almost the same thing in another closure:

    val simi = dataByRow.map(point => {
      val corrs = dataByRow.map(x => arrCorr(point._2, x._2))
      (point._1, corrs)
    })

Here dataByRow is of type RDD[(Int, Array[Double])] and arrCorr is a function that I wrote to compute the correlation between two arrays.
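(A sketch of one way to rephrase this without the nested RDD, assuming the collected rows fit in memory and that arrCorr(a, b) returns a Double:)

    val localRows = dataByRow.collect()           // Array[(Int, Array[Double])]
    val rowsBC = sc.broadcast(localRows)          // ship the rows to each executor once
    val simi = dataByRow.map { point =>
      // The inner loop now runs over a plain Array, not a second RDD.
      val corrs = rowsBC.value.map(x => arrCorr(point._2, x._2))
      (point._1, corrs)                           // corrs: Array[Double]
    }

Broadcasting the collected rows avoids capturing an RDD in the closure, at the cost of holding a full copy of the data on each executor.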

Re: java.lang.NullPointerException met when computing new RDD or use .count

2014-03-17 Thread Andrew Ash
It looks like you're trying to access an RDD ("D") from inside a closure (the parameter to the first map), which isn't possible with the current implementation of Spark. Can you rephrase the computation so that it doesn't access D from inside the map call?
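(Another way to avoid referencing D inside the closure is to pair the elements up front with cartesian; a sketch, where d, D, and f are hypothetical names:)

    // Build all pairs as a single RDD, then map a plain function over them;
    // no RDD is captured in the closure.
    val pairs = d.cartesian(D)                        // RDD[(A, B)]
    val result = pairs.map { case (p, q) => f(p, q) }

Note that cartesian materializes every (p, q) combination, so it can be expensive when both RDDs are large.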