val cartesienProduct = locations.cartesian(locations).map(t =>
  Edge(t._1._1, t._2._1, distanceAmongPoints(t._1._2._1, t._1._2._2, t._2._2._1, t._2._2._2)))

Code executes perfectly fine up to here, but when I try to use
"cartesienProduct" it gets stuck, i.e.

val count = cartesienProduct.count()

Any help to do this efficiently will be highly appreciated.
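For context on why the `count()` appears to hang: a cartesian product of n points produces n × n pairs, so the work is quadratic and explodes quickly (10^5 points already means 10^10 pairs). The sketch below is illustrative plain Scala, not the poster's Spark code: it assumes the distance is symmetric (so only the i < j half of the pairs needs to be generated), and `distanceAmongPoints` is given a hypothetical Euclidean body, since the original implementation is not shown.

```scala
// Illustrative sketch only. A full cartesian product of n points yields n*n
// pairs; if distance(a, b) == distance(b, a), emitting only pairs with i < j
// cuts that to n*(n-1)/2 without losing information.
object PairwiseSketch {
  // Assumed Euclidean distance; the original post's implementation is not shown.
  def distanceAmongPoints(x1: Double, y1: Double, x2: Double, y2: Double): Double =
    math.hypot(x2 - x1, y2 - y1)

  // All unordered pairs (idI, idJ, distance) with i < j: n*(n-1)/2 entries.
  def pairwise(locations: Array[(Long, (Double, Double))]): Seq[(Long, Long, Double)] =
    for {
      i <- locations.indices
      j <- (i + 1) until locations.length
      (idI, (xI, yI)) = locations(i)
      (idJ, (xJ, yJ)) = locations(j)
    } yield (idI, idJ, distanceAmongPoints(xI, yI, xJ, yJ))

  def main(args: Array[String]): Unit = {
    val locations = Array((1L, (0.0, 0.0)), (2L, (3.0, 4.0)), (3L, (6.0, 8.0)))
    val pairs = pairwise(locations)
    println(pairs.length) // 3 pairs instead of the 9 a full cartesian product yields
    println(pairs.head)   // (1,2,5.0)
  }
}
```

The same i < j idea carries over to a Spark `cartesian` by filtering pairs where the first id is less than the second before computing distances, which halves the shuffle and the distance computations.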
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble
> From: mas.ha...@gmail.com
> To: user@spark.apache.org
> Subject: Incremently load big RDD file into Memory
>
>
> val locations = filelines.map(line => line.split("\t")).map(t =>
> (t(5).toLong, (t(2).toDouble, t(3).toDouble))).distinct().collect()
>
> val cartesienProduct = locations.cartesian(locations).map(t =>
>   Edge(t._1._1, t._2._1, distanceAmongPoints(t._1._2._1, t._1._2._2, t._2._2._1, t._2._2._2)))
>
> Code executes perfectly fine up to here, but when I try to use
> "cartesienProduct" it gets stuck, i.e.
>
> val count = cartesienProduct.count()
>
> Any help to do this efficiently will be highly appreciated.