Hello,

I am currently trying to monitor the progress of jobs. I created a class
extending SparkListener, added a JobProgressListener to my SparkContext, and
overrode the methods onTaskStart, onTaskEnd, onJobStart and onJobEnd, which
works well.
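For reference, a minimal sketch of the setup described above, using the
SparkListener API from org.apache.spark.scheduler (class and method names as
in the Spark 1.x Scala API; the class name ProgressListener is my own):

```scala
import org.apache.spark.scheduler._

// Sketch of a listener that reports job and task lifecycle events.
class ProgressListener extends SparkListener {
  override def onJobStart(jobStart: SparkListenerJobStart): Unit =
    println(s"Job ${jobStart.jobId} started")

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
    println(s"Job ${jobEnd.jobId} ended")

  override def onTaskStart(taskStart: SparkListenerTaskStart): Unit =
    println(s"Task ${taskStart.taskInfo.taskId} started")

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit =
    println(s"Task ${taskEnd.taskInfo.taskId} ended")
}

// Registered on the SparkContext:
// sc.addSparkListener(new ProgressListener)
```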

I would now also like to monitor the progress of one job relative to the
overall progress of all jobs. I suspect this is not directly possible, so I
would like to retrieve the list of all jobs (or at least the number of
jobs), so that I can approximate the overall progress by dividing the
progress of one job by the total number of jobs.
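To illustrate what I mean, here is a sketch of that approximation. It
assumes the total number of jobs (totalJobs) is supplied by the application
itself, since that is exactly the value I do not know how to obtain from
Spark:

```scala
import java.util.concurrent.atomic.AtomicInteger
import org.apache.spark.scheduler._

// Sketch: approximate overall progress as completedJobs / totalJobs.
// totalJobs is an assumption here; it would have to come from somewhere.
class GlobalProgressListener(totalJobs: Int) extends SparkListener {
  private val completedJobs = new AtomicInteger(0)

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = {
    val done = completedJobs.incrementAndGet()
    val globalProgress = done.toDouble / totalJobs
    println(f"Overall progress: ${globalProgress * 100}%.1f%%")
  }
}
```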

However, I cannot find how to do this. I searched through the
JobProgressListener API, but I only found methods to get the list of active
jobs and the list of already completed jobs. Is there a way to get the
number or the list of all jobs in the current version of Spark?



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Is-there-a-way-to-get-the-list-of-all-jobs-tp22635.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
