I was wrong here.
I am using a Spark standalone cluster, not YARN or Mesos. Is it
possible to track Spark execution memory?
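
One way to check this on a standalone cluster is to poll the driver's monitoring REST API, which reports per-executor memory regardless of the cluster manager. A minimal sketch in Scala, assuming the driver UI runs on the default port 4040 (the host, object name, and app-id argument here are placeholders):

import scala.io.Source

// Minimal sketch: poll Spark's monitoring REST API for executor memory.
object ExecutorMemoryPoller {
  private val base = "http://localhost:4040/api/v1"

  def main(args: Array[String]): Unit = {
    // /applications lists running apps, including their ids.
    println(Source.fromURL(s"$base/applications").mkString)

    // /applications/{id}/executors returns one summary per executor,
    // including memoryUsed, maxMemory and storage memoryMetrics.
    args.headOption.foreach { appId =>
      println(Source.fromURL(s"$base/applications/$appId/executors").mkString)
    }
  }
}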
On Mon, Oct 21, 2019 at 5:42 PM Sriram Ganesh wrote:
> I looked into this, and I found it is possible like this:
>
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/status/AppStatusListener.scala#L229
I looked into this, and I found it is possible like this:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/status/AppStatusListener.scala#L229
Line no. 230; this is for executors.
Just want to cross-verify: is that right?
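
For a live application, the same events can be consumed with a custom SparkListener, since AppStatusListener is itself just a listener. A rough sketch against the Spark 3.x API (the class name and choice of metric are mine, not taken from AppStatusListener):

import scala.collection.mutable
import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorMetricsUpdate}

// Rough sketch: record per-executor peak execution memory from the same
// heartbeat events that AppStatusListener consumes.
class ExecutorMemoryListener extends SparkListener {
  private val peaks = mutable.HashMap.empty[String, Long]

  override def onExecutorMetricsUpdate(
      update: SparkListenerExecutorMetricsUpdate): Unit = {
    // executorUpdates is keyed by (stageId, stageAttemptId); each value
    // holds the executor's peak metrics since the last heartbeat.
    update.executorUpdates.values.foreach { metrics =>
      // "OnHeapExecutionMemory" is one of the ExecutorMetricType names
      // ("JVMHeapMemory" etc. work the same way); 0 if not yet collected.
      val mem = metrics.getMetricValue("OnHeapExecutionMemory")
      peaks.update(update.execId, math.max(peaks.getOrElse(update.execId, 0L), mem))
    }
  }

  def report(): Unit = peaks.foreach { case (id, bytes) =>
    println(s"executor $id peak on-heap execution memory: $bytes bytes")
  }
}

Register it with sparkContext.addSparkListener(new ExecutorMemoryListener) or through the spark.extraListeners config.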
Hi,
I want to monitor how much memory each executor and task uses for a given job. Is there any direct method available that can be used to track this metric?
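
One direct number for the task side is TaskMetrics.peakExecutionMemory. A small sketch that logs it from task-end events (Spark 3.x; the listener class name is illustrative):

import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Sketch: print each task's peak execution memory as tasks complete.
class TaskMemoryListener extends SparkListener {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // taskMetrics can be null if a task fails before reporting metrics.
    Option(taskEnd.taskMetrics).foreach { m =>
      println(s"stage ${taskEnd.stageId} task ${taskEnd.taskInfo.taskId}: " +
        s"peakExecutionMemory = ${m.peakExecutionMemory} bytes")
    }
  }
}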
--
*Sriram G*
*Tech*