Hello Spark experts - I’m running Spark jobs in cluster mode using a
dedicated cluster for each job. Is there a way to see how much compute time
each job takes via Spark APIs, metrics, etc.? In case it makes a
difference, I’m using AWS EMR - I’d ultimately like to be able to say this
job costs $X since it took Y minutes on Z instance types (assuming all of
the nodes are the same instance type), but I figure I’d probably need to
get the instance type Z through the EMR APIs.
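
To make the question concrete, here’s a rough sketch of the kind of thing
I’m picturing (Python), assuming the Spark history server REST API for the
application duration plus the EMR ListInstanceGroups call for instance
types - the history server URL, cluster ID, and price table are just
placeholders I’d fill in for real:

# Rough sketch only; host, cluster ID, and prices below are placeholders.
import requests
import boto3

HISTORY_SERVER = "http://<history-server-host>:18080"  # placeholder
CLUSTER_ID = "j-XXXXXXXXXXXXX"                          # placeholder
HOURLY_PRICE = {"m5.xlarge": 0.192}                     # placeholder rates

# 1. Application duration from the Spark history server REST API.
app = requests.get(f"{HISTORY_SERVER}/api/v1/applications").json()[0]
attempt = app["attempts"][-1]
hours = attempt["duration"] / 1000.0 / 3600.0  # duration is in milliseconds

# 2. Instance types and counts from the EMR API.
emr = boto3.client("emr")
groups = emr.list_instance_groups(ClusterId=CLUSTER_ID)["InstanceGroups"]

# 3. Cost estimate: hours * instance count * hourly rate, summed per group.
cost = sum(
    hours * g["RunningInstanceCount"] * HOURLY_PRICE.get(g["InstanceType"], 0.0)
    for g in groups
)
print(f"{app['name']}: ~{hours:.2f} h, est. ${cost:.2f}")

Is there a better built-in way to get at this (metrics sinks, listeners,
etc.), or is stitching it together like this the usual approach?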

Thanks!
Jack
