Spark TaskMetrics[1] has a "jvmGCTime" metric that captures the amount of
time the JVM spent in GC while the task was running. It is also exposed
through the SparkListener API, via the TaskMetrics attached to
SparkListenerTaskEnd events.
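A minimal sketch of a listener along those lines (the class name,
threshold parameter, and log wording are illustrative, not anything Spark
ships; onTaskEnd, taskMetrics, and jvmGCTime are real API):

```scala
import org.apache.log4j.Logger
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Hypothetical listener: warn when a task's cumulative GC time exceeds
// a configured threshold, roughly analogous to JvmPauseMonitor's logging.
class GcTimeListener(thresholdMs: Long) extends SparkListener {
  private val log = Logger.getLogger(classOf[GcTimeListener])

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // taskMetrics can be null for tasks that failed before reporting metrics
    Option(taskEnd.taskMetrics).foreach { metrics =>
      if (metrics.jvmGCTime > thresholdMs) {
        log.warn(s"Task ${taskEnd.taskInfo.taskId} in stage ${taskEnd.stageId} " +
          s"spent ${metrics.jvmGCTime} ms in GC (threshold: $thresholdMs ms)")
      }
    }
  }
}
```

You can register it with sparkContext.addSparkListener(...) or via the
spark.extraListeners config. One caveat versus JvmPauseMonitor: this only
fires when a task ends, so it reports GC time after the fact rather than
detecting a long pause while it is happening.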

Thanks,
Arun

[1]
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/executor/TaskMetrics.scala#L89


On Mon, 15 Apr 2019 at 09:52, Eugene Koifman <eugene.koif...@workday.com>
wrote:

> Hi,
>
> A number of projects in the Hadoop ecosystem use
> org.apache.hadoop.util.JvmPauseMonitor (or clones of it) to log long GC
> pauses.
>
> Is there something like that for a Spark Executor, that can make a log
> entry based on GC time exceeding a configured limit?
>
>
>
> Thank you,
>
> Eugene
>
>
