Re: Incrementally load big RDD file into Memory

2015-04-08 Thread MUHAMMAD AAMIR
…p(t =>
  Edge(t._1._1, t._2._1,
       distanceAmongPoints(t._1._2._1, t._1._2._2, t._2._2._1, t._2._2._2)))

Code executes perfectly fine up to here, but when I try to use "cartesienProduct" it gets stuck, i.e. …
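[A minimal sketch of the step quoted above, assuming cartesienProduct pairs up (id, (x, y)) tuples and that Edge is the GraphX edge class; the body of distanceAmongPoints never appears in the thread, so plain Euclidean distance is used as a placeholder:]

  import org.apache.spark.graphx.Edge

  // distanceAmongPoints is never shown in the thread; plain
  // Euclidean distance is assumed here as a placeholder.
  def distanceAmongPoints(x1: Double, y1: Double,
                          x2: Double, y2: Double): Double =
    math.sqrt(math.pow(x2 - x1, 2) + math.pow(y2 - y1, 2))

  // Each cartesian-product element is a pair of (id, (x, y))
  // tuples; map it to a GraphX Edge whose attribute is the
  // distance between the two points.
  val edges = cartesienProduct.map(t =>
    Edge(t._1._1, t._2._1,
         distanceAmongPoints(t._1._2._1, t._1._2._2,
                             t._2._2._1, t._2._2._2)))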

Re: Incrementally load big RDD file into Memory

2015-04-08 Thread Guillaume Pitel
…Code executes perfectly fine up to here, but when I try to use "cartesienProduct" it gets stuck, i.e. val count = cartesienProduct.count(). Any help to do this efficiently will be highly appreciated.

RE: Incrementally load big RDD file into Memory

2015-04-07 Thread java8964
> From: mas.ha...@gmail.com
> To: user@spark.apache.org
> Subject: Incrementally load big RDD file into Memory
>
> val locations = filelines.map(line => line.split("\t")).map(t =>
>   (t(5).toLong, (t(2).toDouble, t(3).toDouble))).distinct().collect()
>
> val cartes…
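[The quote cuts off at "val cartes…". A sketch of how that line presumably continued, assuming (the thread never shows it) that the pairs were built with RDD.cartesian on the points themselves; note that dropping the collect() keeps the points distributed as an RDD, which is what cartesian() operates on:]

  val locations = filelines.map(line => line.split("\t"))
    .map(t => (t(5).toLong, (t(2).toDouble, t(3).toDouble)))
    .distinct()   // no collect(): keep the points as an RDD

  // assumed continuation of the truncated "val cartes…" line
  val cartesienProduct = locations.cartesian(locations)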

Incrementally load big RDD file into Memory

2015-04-07 Thread mas
…2)))

Code executes perfectly fine up to here, but when I try to use "cartesienProduct" it gets stuck, i.e.

val count = cartesienProduct.count()

Any help to do this efficiently will be highly appreciated.
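[For context, a self-contained sketch of the job as quoted across the thread; the input path and SparkContext setup are assumptions, since the thread shows only the transformations. It also makes the scale problem concrete: cartesian() on n points yields n × n pairs, so even 100,000 points produce 10^10 records, which is why count() appears to hang on a big file:]

  import org.apache.spark.{SparkConf, SparkContext}

  object CartesianCount {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("cartesian-count"))

      // hypothetical path; the thread's format is tab-separated with
      // an id in column 5 and coordinates in columns 2 and 3
      val filelines = sc.textFile("hdfs:///path/to/points.tsv")

      val locations = filelines.map(_.split("\t"))
        .map(t => (t(5).toLong, (t(2).toDouble, t(3).toDouble)))
        .distinct()

      // n points -> n * n pairs: the count below is quadratic in the
      // input size, which is why the job appears stuck on large files
      val cartesienProduct = locations.cartesian(locations)
      println(s"pairs: ${cartesienProduct.count()}")

      sc.stop()
    }
  }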