The Spark UI has timing information. When running locally, it is at
http://localhost:4040
Otherwise the URL of the UI is printed to the console when you
start up the Spark shell or run a job.
Reza
On Fri, Oct 24, 2014 at 5:51 AM, shahab wrote:
Hi,
I just wonder if there is a built-in function to get the execution time
for each of the jobs/tasks. In simple words, how can I find out how much
time is spent on the loading/mapping/filtering/reducing parts of a job? I
can see printouts in the logs, but since there is no clear presentation of the
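Besides the Spark UI, one option is to time each action yourself. A minimal pure-Python sketch of that pattern follows; the `timed` helper is hypothetical, not part of any Spark API, and plain Python `map`/`filter` stand in for RDD transformations. The point it illustrates carries over to Spark: transformations are lazy, so wrapping a timer around a `map` or `filter` call measures almost nothing, and the real work only shows up when you time the action that forces evaluation.

```python
import time

def timed(label, action):
    """Run a zero-argument callable, print how long it took, return its result.
    (Hypothetical helper for illustration, not a Spark API.)"""
    start = time.perf_counter()
    result = action()
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.3f}s")
    return result

# Stand-in for an RDD pipeline: map/filter on an iterable are lazy,
# just like Spark transformations -- building them costs ~nothing.
data = range(1_000_000)
mapped = timed("map (lazy)", lambda: map(lambda x: x * 2, data))
filtered = timed("filter (lazy)", lambda: filter(lambda x: x % 3 == 0, mapped))

# The "action" forces evaluation, so this is where the time shows up.
total = timed("reduce (action)", lambda: sum(filtered))
```

The same caveat applies when timing Spark stages by hand: measure around actions such as `count` or `collect`, not around the transformation calls themselves.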