From: Aurélien Mazoyer
Date: Monday, September 6, 2021 at 5:47 AM
To: Haryani, Akshay
Cc: user@spark.apache.org
Subject: Re: Get application metric from Spark job
Hi Akshay,
Thank you for your reply. That sounds like a good idea, but I unfortunately have
a 2.6 cluster. Do you know if there is another solution that would
run on 2.6, or do I have no choice but to migrate to 3?
Regards,
Aurélien
On Thu, Sep 2, 2021 at 8:12 PM, Haryani, Akshay wrote:
Hi Aurélien,
Spark has endpoints that expose application metrics. These endpoints
can be used as a REST API. You can read more about them here:
https://spark.apache.org/docs/3.1.1/monitoring.html#rest-api
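For illustration, polling those endpoints might look like the sketch below. The host/port and the aggregation helper are illustrative assumptions, not something from this thread; the `/applications/[app-id]/executors` path and the `totalTasks`/`memoryUsed` fields come from the monitoring page linked above.

```python
# Sketch: reading executor metrics from Spark's monitoring REST API.
# Assumptions: the driver UI is reachable at localhost:4040 (a history
# server typically listens on :18080 instead) -- adjust for your setup.
import json
from urllib.request import urlopen

BASE = "http://localhost:4040/api/v1"

def fetch_executors(app_id):
    """Fetch the executor list for one application as parsed JSON."""
    with urlopen(f"{BASE}/applications/{app_id}/executors") as resp:
        return json.load(resp)

def executor_summary(executors):
    """Aggregate a few fields from the /executors payload."""
    return {
        "count": len(executors),
        "totalTasks": sum(e.get("totalTasks", 0) for e in executors),
        "memoryUsed": sum(e.get("memoryUsed", 0) for e in executors),
    }
```

A caller would do something like `executor_summary(fetch_executors("app-2021..."))` to get a quick rollup per application.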
Additionally,
if you want to build your own custom metrics, you can explore Spark