Hello,

As far as I know, Spark does not provide a public API for tracking the execution memory of a worker node. To track it, you would probably need to read the MemoryManager's onHeapExecutionMemoryPool and offHeapExecutionMemoryPool objects, which account for the memory allocated to tasks for execution. You could then either write the amounts to a log on each executor and parse the logs later, or send them to the driver at a periodic interval so the usage is collected in a central location. Either way, tracking execution memory will likely require code changes in Apache Spark.
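For illustration, here is a rough, untested Scala sketch of the periodic-logging approach, assuming Spark 2.x internals. MemoryManager is private[spark], so the helper has to be compiled under the org.apache.spark package (or reached via reflection). ExecutionMemoryLogger, the interval, and the log format are placeholder names of mine, not a Spark API:

    // Rough sketch, untested -- assumes Spark 2.x internals.
    // Hypothetical helper; it must live in the org.apache.spark package
    // because MemoryManager is private[spark].
    package org.apache.spark

    import java.util.concurrent.{Executors, ThreadFactory, TimeUnit}

    object ExecutionMemoryLogger {
      @volatile private var started = false

      def start(intervalSeconds: Long = 10): Unit = synchronized {
        if (started) return   // start at most once per executor JVM
        started = true
        val mm = SparkEnv.get.memoryManager
        val scheduler = Executors.newSingleThreadScheduledExecutor(
          new ThreadFactory {
            override def newThread(r: Runnable): Thread = {
              val t = new Thread(r, "execution-memory-logger")
              t.setDaemon(true) // don't keep the executor JVM alive
              t
            }
          })
        scheduler.scheduleAtFixedRate(new Runnable {
          override def run(): Unit = {
            // executionMemoryUsed sums the on-heap and off-heap execution
            // pools; storageMemoryUsed is the cached-blocks side.
            println(s"[mem] execution=${mm.executionMemoryUsed} " +
              s"storage=${mm.storageMemoryUsed}")
          }
        }, 0, intervalSeconds, TimeUnit.SECONDS)
      }
    }

You would then need to trigger ExecutionMemoryLogger.start() once inside each executor JVM, e.g. from the first task that runs there (the started flag guards against duplicate schedulers), or via an executor plugin if your Spark version supports one.

Hope this helps.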
Regards,
Muhib
Ph.D. Student, FSU

On Thu, Sep 20, 2018 at 4:46 PM Liu, Jialin <jial...@illinois.edu> wrote:

> Hi there,
>
> I am currently using a Spark cluster to run jobs, but I really need to
> collect the history of the actual memory usage (that's execution memory +
> storage memory) of the job across the whole cluster. I know we can get
> the storage memory usage through either the Spark UI Executors page or
> the SparkContext.getExecutorMemoryStatus() API, but I could not get the
> real-time execution memory usage.
> Is there any way I can collect the total memory usage? Thank you so much!
>
> Best,
> Jialin Liu
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org