I don't have a proper answer to this, but as a workaround: if you run 2
independent Spark jobs, you could update one graph while the other is
serving reads. It's still not scalable for incessant updates, though.
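Roughly something like this (an untested sketch, not a full implementation —
`sc`, the vertex/edge RDDs, and the batch names are placeholders you would
supply yourself):

```scala
import org.apache.spark.graphx.{Graph, VertexId}
import org.apache.spark.rdd.RDD

// Two references to the same logical graph: one serves reads,
// the other absorbs a batch of new vertices off-line.
var serving: Graph[String, Int]  = Graph(initialVertices, initialEdges)
var updating: Graph[String, Int] = serving

// GraphX graphs are immutable, so "adding" vertices means rebuilding
// the graph from the union of the old and new vertex RDDs.
def addBatch(g: Graph[String, Int],
             batch: RDD[(VertexId, String)]): Graph[String, Int] =
  Graph(g.vertices.union(batch), g.edges)

// Update the off-line copy, then swap which reference readers use.
updating = addBatch(updating, newVertexBatch)
val tmp = serving; serving = updating; updating = tmp
```

The swap itself is cheap; the cost is still the rebuild in addBatch, which is
why this doesn't help if updates arrive continuously.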
Regards
Sab
On 25-Feb-2016 7:19 pm, "Udbhav Agarwal" wrote:
Hi,
I am using GraphX. I am adding a batch of vertices to a graph with around
100,000 vertices and a few edges. Adding around 400 vertices takes 7 seconds
on one machine with 8 cores and 8 GB of RAM. My trouble is that while this
process of addition is happening on the graph (named inputGraph), am not a