Weichen Xu created SPARK-51666:
----------------------------------

             Summary: Fix sparkStageCompleted executorRunTime metric calculation
                 Key: SPARK-51666
                 URL: https://issues.apache.org/jira/browse/SPARK-51666
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 4.1.0
            Reporter: Weichen Xu


Fix sparkStageCompleted executorRunTime metric calculation:

When a Spark task uses multiple CPUs, the CPU-seconds metric should capture the 
total execution time across all of those CPUs. For example, if a stage sets the 
CPUs per task to 48 and a task runs for 10 seconds, the total CPU-seconds for 
that stage should be 10 seconds X 1 task X 48 CPUs = 480 CPU-seconds. If another 
task uses only 1 CPU, its total CPU-seconds are 10 seconds X 1 CPU = 
10 CPU-seconds.
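
As a rough illustration of the intended arithmetic (this is only a sketch, not 
the actual Spark Core patch), the snippet below weights each task's run time by 
the number of CPUs it was scheduled with before summing per stage. The names 
TaskRun, cpusPerTask, and totalCpuSeconds are hypothetical and used only for 
this example; in the real fix the CPU count would come from the task's resource 
profile (spark.task.cpus or a stage-level scheduling override).

{code:scala}
// Per-task record: wall-clock run time and the number of CPUs the task held.
case class TaskRun(runTimeSec: Long, cpusPerTask: Int)

object StageCpuSeconds {
  // CPU-seconds for a stage = sum over tasks of (run time x CPUs per task),
  // rather than summing raw run times as if every task used a single CPU.
  def totalCpuSeconds(tasks: Seq[TaskRun]): Long =
    tasks.map(t => t.runTimeSec * t.cpusPerTask).sum

  def main(args: Array[String]): Unit = {
    // Example from the description: 10 s on 48 CPUs -> 480 CPU-seconds,
    // plus 10 s on 1 CPU -> 10 CPU-seconds.
    val tasks = Seq(TaskRun(runTimeSec = 10, cpusPerTask = 48),
                    TaskRun(runTimeSec = 10, cpusPerTask = 1))
    println(totalCpuSeconds(tasks)) // 490
  }
}
{code}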

*This is an important fix: since Spark introduced stage-level scheduling (so 
that tasks of different stages can be configured with different numbers of 
CPUs), without this fix the data pipeline revenue calculation spreads DBUs 
evenly across these tasks.*


