Hi, 

I'm running out of memory when I run a GraphX program on a dataset larger than
10 GB. The same data is handled fine by normal Spark operations when I use
StorageLevel.MEMORY_AND_DISK. 
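
For normal RDDs I simply do something like this (a minimal sketch; the path and
variable names are placeholders, not my actual job):

import org.apache.spark.storage.StorageLevel

val data = sc.textFile("hdfs:///path/to/input")   // more than 10 GB of input
data.persist(StorageLevel.MEMORY_AND_DISK)        // spills to disk instead of failing
data.count()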

In the case of GraphX I found that it only allows storing in memory, because the
Graph constructor sets this storage level by default. When I try to change the
storage level to what I need, it is not allowed and throws an error message
saying "Cannot Modify StorageLevel when Its already set". 
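
Roughly, this is what I am doing (a simplified sketch, not my exact job; the
path is a placeholder):

import org.apache.spark.graphx.GraphLoader
import org.apache.spark.storage.StorageLevel

val graph = GraphLoader.edgeListFile(sc, "hdfs:///path/to/edges.txt")

// The graph's vertex and edge RDDs are already cached in memory by the
// constructor, so re-persisting them with a different level fails:
graph.vertices.persist(StorageLevel.MEMORY_AND_DISK)  // throws the error above
graph.edges.persist(StorageLevel.MEMORY_AND_DISK)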

Please help me with these queries: 
1 > How can I override the current storage level to MEMORY_AND_DISK? (See the
sketch after this list for the kind of call I have in mind.) 
2 > If it's not possible through the constructor, what if I modify the
Graph.scala class and rebuild it to make it work? If I do that, is there
anything else I need to know? 
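
What I am hoping for is something along these lines (just a sketch of the API I
am looking for; I am not sure whether parameters like edgeStorageLevel /
vertexStorageLevel actually exist in my Spark version):

import org.apache.spark.graphx.GraphLoader
import org.apache.spark.storage.StorageLevel

val graph = GraphLoader.edgeListFile(
  sc,
  "hdfs:///path/to/edges.txt",
  edgeStorageLevel = StorageLevel.MEMORY_AND_DISK,
  vertexStorageLevel = StorageLevel.MEMORY_AND_DISK)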

Thanks           



-----
--Harihar
