Hi,

I'm using Spark 1.5.1, and when I look at the JSON data for a running 
application, every stage has an "executorRunTime" field, which is typically a 
7-digit number for the PageRank application running on a large (1.1 GB) 
input. Does this represent the execution time of the stage in milliseconds? 
It doesn't seem so, because it doesn't match what the Spark UI shows (the 
"Duration" column for a stage). So what does it represent, and how do I get 
the duration of a stage?
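
For reference, here is roughly how I'm reading the field; this is just a 
sketch, and the host, port (the default driver UI port 4040) and the 
"app-id" placeholder are stand-ins for my actual setup:

    import json
    import urllib.request

    # Stage-level JSON for a running application, served by the driver UI
    url = "http://localhost:4040/api/v1/applications/app-id/stages"
    with urllib.request.urlopen(url) as resp:
        stages = json.load(resp)

    for stage in stages:
        # "executorRunTime" is the field in question; for my PageRank run
        # it is typically a 7-digit number per stage.
        print(stage["stageId"], stage["name"], stage["executorRunTime"])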

Regards,
Sandeep Saraswat
