Hi Akshay,

Thank you for your reply. That sounds like a good idea, but unfortunately I
have a 2.6 cluster. Do you know of another solution that would run on 2.6,
or do I have no choice but to migrate to 3?

Regards,

Aurélien

On Thu, Sep 2, 2021 at 8:12 PM Haryani, Akshay <akshay.hary...@hpe.com>
wrote:

> Hi Aurélien,
>
> Spark has endpoints that expose Spark application metrics, and these
> endpoints can be queried as a REST API. You can read more about them here:
> https://spark.apache.org/docs/3.1.1/monitoring.html#rest-api
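>
> As a rough sketch (the host, port, and application id below are
> placeholders; while the job runs, the driver UI serves this API on port
> 4040 by default), you could poll it like this:
>
>   import scala.io.Source
>
>   object PollSparkRestApi {
>     def main(args: Array[String]): Unit = {
>       // Base URL of the driver UI's REST API (placeholder host/port).
>       val base = "http://localhost:4040/api/v1"
>
>       // List the applications this UI knows about (returns JSON).
>       println(Source.fromURL(s"$base/applications").mkString)
>
>       // Stage-level metrics for one application (placeholder app id).
>       println(Source.fromURL(s"$base/applications/app-123/stages").mkString)
>     }
>   }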
>
> Additionally, if you want to build your own custom metrics, you can
> explore Spark custom plugins. Using a custom plugin, you can track your
> own metrics and plug them into the Spark metrics system. Please note that
> plugins are supported only on Spark 3.0 and above.
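>
> For illustration, here is a minimal sketch of such a plugin (the class,
> object, and metric names are made up for the example):
>
>   import java.util.{Map => JMap}
>
>   import com.codahale.metrics.Counter
>   import org.apache.spark.api.plugin.{DriverPlugin, ExecutorPlugin,
>     PluginContext, SparkPlugin}
>
>   // Holds the counter so task code running in the executor can bump it.
>   object RecordsProcessed {
>     val counter = new Counter
>   }
>
>   class CustomMetricsPlugin extends SparkPlugin {
>     // No driver-side component is needed for this sketch.
>     override def driverPlugin(): DriverPlugin = null
>
>     override def executorPlugin(): ExecutorPlugin = new ExecutorPlugin {
>       override def init(ctx: PluginContext,
>                         extraConf: JMap[String, String]): Unit = {
>         // Register the counter with the executor's metric registry;
>         // Spark then reports it through the configured metrics sinks.
>         ctx.metricRegistry().register("recordsProcessed",
>           RecordsProcessed.counter)
>       }
>     }
>   }
>
> You would enable it by setting spark.plugins to the plugin's fully
> qualified class name when submitting the job, and call
> RecordsProcessed.counter.inc() from your task code.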
>
> --
> Thanks & Regards,
> Akshay Haryani
>
> From: Aurélien Mazoyer <aurel...@aepsilon.com>
> Date: Thursday, September 2, 2021 at 8:36 AM
> To: user@spark.apache.org <user@spark.apache.org>
> Subject: Get application metric from Spark job
>
> Hi community,
>
> I would like to collect information about the execution of a Spark job
> while it is running. Could I define some kind of application metric (such
> as a counter incremented in my code) that I could retrieve regularly
> during the run?
>
> Thank you for your help,
>
>
> Aurélien
>
