LuciferYang commented on code in PR #49723: URL: https://github.com/apache/spark/pull/49723#discussion_r1933541728
########## core/src/test/scala/org/apache/spark/LocalRootDirsTest.scala ##########

```diff
@@ -46,7 +46,10 @@ trait LocalRootDirsTest extends SparkFunSuite with LocalSparkContext {
   override def afterEach(): Unit = {
     try {
-      Utils.deleteRecursively(tempDir)
+      if (!ShutdownHookManager.hasShutdownDeleteDir(tempDir) &&
```

Review Comment:
https://github.com/apache/spark/blob/42898c62b4c197e88b842eaa840986cb70d93a9c/core/src/test/scala/org/apache/spark/LocalRootDirsTest.scala#L42

In the `beforeEach` method, `tempDir = Utils.createTempDir(namePrefix = "local")` calls `ShutdownHookManager.registerShutdownDeleteDir`, which registers `tempDir` for deletion via a shutdown hook. So if `tempDir` is left unchanged during a test case, manual cleanup is unnecessary. However, because `tempDir` is declared as `protected var tempDir`, a test case may reassign it. Hence, two additional checks run before `tempDir` is cleaned up manually:

1. `tempDir` has not been registered for cleanup via a shutdown hook.
2. `tempDir` is not a subpath of any path that has been registered for cleanup via a shutdown hook.

This minimizes the risk of a race condition in which multiple threads attempt to clean up `tempDir` concurrently. The issue appears to be more prevalent on macOS, so let's run several more rounds of testing to confirm the fix is effective.
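The guarded-cleanup pattern described above can be sketched as a small self-contained example. Note the hedges: `ShutdownDeleteRegistry` below is an illustrative stand-in for Spark's real `org.apache.spark.util.ShutdownHookManager` (which keeps its registry internally and deletes the paths from an actual JVM shutdown hook), and since the `+` line in the diff is truncated after `&&`, the exact second condition is an assumption modeled on check 2 above.

```scala
import java.io.File
import java.nio.file.Files
import scala.collection.mutable

// Illustrative stand-in for Spark's ShutdownHookManager registry; the real
// one also installs a JVM shutdown hook that deletes every registered path.
object ShutdownDeleteRegistry {
  private val shutdownDeletePaths = mutable.HashSet[String]()

  def registerShutdownDeleteDir(dir: File): Unit =
    shutdownDeletePaths.synchronized {
      shutdownDeletePaths += dir.getAbsolutePath
    }

  // Check 1: this exact directory is already registered for hook cleanup.
  def hasShutdownDeleteDir(dir: File): Boolean =
    shutdownDeletePaths.synchronized {
      shutdownDeletePaths.contains(dir.getAbsolutePath)
    }

  // Check 2 (assumed shape): some registered path is a parent of `dir`,
  // so the shutdown hook will delete it transitively.
  def hasRootAsShutdownDeleteDir(dir: File): Boolean = {
    val absolutePath = dir.getAbsolutePath
    shutdownDeletePaths.synchronized {
      shutdownDeletePaths.exists { root =>
        absolutePath != root && absolutePath.startsWith(root + File.separator)
      }
    }
  }
}

object GuardedCleanupDemo {
  def main(args: Array[String]): Unit = {
    // Mirrors beforeEach: the created temp dir is registered for hook cleanup.
    val registered = Files.createTempDirectory("local").toFile
    ShutdownDeleteRegistry.registerShutdownDeleteDir(registered)

    // Mirrors a test case reassigning the `protected var` to an
    // unregistered directory.
    val replaced = Files.createTempDirectory("replaced").toFile

    // Mirrors afterEach: delete manually only when no shutdown hook will,
    // avoiding the race between manual cleanup and the hook.
    for (dir <- Seq(registered, replaced)) {
      if (!ShutdownDeleteRegistry.hasShutdownDeleteDir(dir) &&
          !ShutdownDeleteRegistry.hasRootAsShutdownDeleteDir(dir)) {
        println(s"manually deleting ${dir.getName.take(5)}")
        dir.delete()
      } else {
        println(s"leaving ${dir.getName.take(5)} to the shutdown hook")
      }
    }
  }
}
```

With this guard, the directory created in `beforeEach` is left to the shutdown hook, while a directory a test swapped in (and that no hook knows about) is still cleaned up manually.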