Hi there,

We've been hitting Java OOM errors in the executors of our Spark application. Any tips on how to troubleshoot this and identify which objects are occupying the heap? In the past, when dealing with JVM OOMs, we've analyzed heap dumps, but with Spark we're having a hard time locating the heap dump after a crash. We also anticipate that these heap dumps will be huge (since our nodes have a large memory allocation) and may be difficult to analyze locally. Can someone share their experience dealing with Spark OOMs?
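For context, this is roughly how we were planning to capture dumps on OOM, using the standard HotSpot flags via Spark's extraJavaOptions (the dump path below is just a placeholder, not what we actually run):

    spark-submit \
      --conf "spark.executor.extraJavaOptions=-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/spark_heap_dumps" \
      --conf "spark.driver.extraJavaOptions=-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/spark_heap_dumps" \
      ... rest of our usual submit arguments ...

(When -XX:HeapDumpPath points at a directory, the JVM should write java_pid<pid>.hprof into it.) Our concern is that each executor writes its dump to the local disk of whatever node it ran on, so once the container is torn down we're not sure where, or whether, the .hprof file survives.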
Thanks!