Ngone51 commented on code in PR #47578:
URL: https://github.com/apache/spark/pull/47578#discussion_r1703580159


##########
core/src/main/scala/org/apache/spark/executor/TaskMetrics.scala:
##########
@@ -272,8 +270,9 @@ class TaskMetrics private[spark] () extends Serializable {
    */
   @transient private[spark] lazy val _externalAccums = new ArrayBuffer[AccumulatorV2[_, _]]
 
-  private[spark] def externalAccums: Seq[AccumulatorV2[_, _]] = withReadLock {
-    _externalAccums.toArray.toImmutableArraySeq
+  private[spark] def withExternalAccums[T](op: ArrayBuffer[AccumulatorV2[_, _]] => T)
+    : T = withReadLock {
+    op(_externalAccums)

Review Comment:
   > Here a read lock is used, but how can we ensure that the op here is a read-only operation?
   
   @LuciferYang Yeah, I don't think there is a programmatic way to enforce this; it has to rely on developers being careful. Let me add a comment to clarify.
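   
   For context, here is a minimal standalone sketch of how that read-only contract could be documented on a method of this shape. The class name, element type, and lock field below are illustrative only, not the actual `TaskMetrics` internals:
   
```scala
import java.util.concurrent.locks.ReentrantReadWriteLock

import scala.collection.mutable.ArrayBuffer

// Standalone sketch of the pattern discussed above; names are illustrative.
class AccumsHolder {
  private val rwLock = new ReentrantReadWriteLock()

  // Run `body` while holding the read lock.
  private def withReadLock[T](body: => T): T = {
    rwLock.readLock().lock()
    try body finally rwLock.readLock().unlock()
  }

  private val _externalAccums = new ArrayBuffer[String]()

  /**
   * Applies `op` to the internal buffer while holding the read lock.
   *
   * Note: `op` must be read-only. The read lock only excludes concurrent
   * writers; it cannot stop `op` itself from mutating the buffer, so the
   * read-only contract can only be documented, not enforced by the compiler.
   */
  def withExternalAccums[T](op: ArrayBuffer[String] => T): T = withReadLock {
    op(_externalAccums)
  }
}
```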



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

