Isca Harmatz Mon, 01 Dec 2014 21:20:49 -0800
Hello, I'm running Spark on a cluster and I want to monitor how many nodes/cores are active at different (specific) points in the program.
Is there any way to do this? Thanks, Isca
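[For context: one common way to check this from inside the driver program is `SparkContext.getExecutorMemoryStatus`, which returns a map keyed by executor address. A minimal sketch, assuming a live `SparkContext` named `sc`; the helper name `logActiveExecutors` is illustrative, not a Spark API:]

```scala
import org.apache.spark.SparkContext

// Hypothetical helper: call at the specific points in the program you
// want to monitor. getExecutorMemoryStatus returns one entry per block
// manager, so on Spark 1.x the count includes the driver itself.
def logActiveExecutors(sc: SparkContext, label: String): Unit = {
  val executors = sc.getExecutorMemoryStatus.keys
  println(s"[$label] active block managers: ${executors.size}")
  executors.foreach(addr => println(s"[$label]   $addr"))
}
```

The Spark web UI (driver port 4040 by default) also shows live executors, but calling something like the sketch above lets you snapshot the count at exact points in the job rather than observing it interactively.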