Hi,
If you run into heap problems in Spark/GraphX, it is better to split the
partitions into smaller ones so that each partition fits in memory.
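
For example, with GraphX you can request more (and therefore smaller) edge
partitions when loading the graph. A minimal sketch, where the input path and
the partition count are hypothetical values you would tune to your data:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx.GraphLoader

// Load the edge list into many small partitions so that each
// partition fits comfortably in executor memory. The path and
// the numEdgePartitions value below are hypothetical.
val sc = new SparkContext(new SparkConf().setAppName("SplitPartitions"))
val graph = GraphLoader.edgeListFile(
  sc,
  "hdfs:///data/edges.txt",
  numEdgePartitions = 600)
println("edges loaded: " + graph.edges.count())

An already-loaded graph can likewise be spread across more partitions with
graph.partitionBy(PartitionStrategy.EdgePartition2D, numPartitions).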
On Sat, Mar 14, 2015 at 12:09 AM, Hlib Mykhailenko <
hlib.mykhaile...@inria.fr> wrote:
> Hello,
>
> I cannot process a graph with 230M edges.
> I cloned apache.spark, built it, and then tried it on a cluster.
> I used a Spark Standalone cluster:
> - 5 machines (each has 12 cores / 32 GB RAM)
> - 'spark.executor.memory' = 25g
> - 'spark.driver.memory' = 3g
> The graph has 231359027 edges. And its file weig
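
One thing to check with those settings: a minimal sketch of how they might be
applied programmatically (the application name is hypothetical; note that
spark.driver.memory normally has to be supplied before the driver JVM starts,
e.g. via spark-submit or spark-defaults.conf, rather than in SparkConf):

import org.apache.spark.{SparkConf, SparkContext}

// Mirrors the configuration described above. "GraphProcessing"
// is a hypothetical application name.
val conf = new SparkConf()
  .setAppName("GraphProcessing")
  .set("spark.executor.memory", "25g")
// spark.driver.memory is usually passed on the spark-submit
// command line (--driver-memory 3g), because in client mode the
// driver JVM is already running by the time this code executes.
val sc = new SparkContext(conf)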