vrozov commented on code in PR #50594: URL: https://github.com/apache/spark/pull/50594#discussion_r2064088121
##########
core/src/test/scala/org/apache/spark/util/UninterruptibleThreadSuite.scala:
##########
@@ -115,6 +116,45 @@ class UninterruptibleThreadSuite extends SparkFunSuite {
     assert(interruptStatusBeforeExit)
   }
 
+  test("no runUninterruptibly") {
+    @volatile var hasInterruptedException = false
+    val t = new UninterruptibleThread("test") {
+      override def run(): Unit = {
+        if (sleep(0)) {
+          hasInterruptedException = true
+        }
+      }
+    }
+    t.interrupt()
+    t.start()
+    t.join()
+    assert(hasInterruptedException === true)
+  }
+
+  test("SPARK-51821 uninterruptibleLock deadlock") {
+    val latch = new CountDownLatch(1)
+    val task = new UninterruptibleThread("task thread") {
+      override def run(): Unit = {
+        val channel = new AbstractInterruptibleChannel() {
+          override def implCloseChannel(): Unit = {
+            begin()
+            latch.countDown()
+            try {
+              Thread.sleep(Long.MaxValue)
+            } catch {
+              case _: InterruptedException => Thread.currentThread().interrupt()
+            }
+          }
+        }
+        channel.close()

Review Comment:
   @mridulm @Ngone51 Streaming requires `UninterruptibleThread`; see SPARK-21248. I also don't think it is necessary to revisit the usage of `UninterruptibleThread`. The `run()` method is not affected at all. The only affected (overridden) method is `interrupt()`, and with the fix it won't be impacted either. The only difference from `Thread.interrupt()` is acquiring `uninterruptibleLock`, which is a low-cost operation when there is no contention (i.e., multiple threads calling `interrupt()` concurrently); and since `Thread.interrupt()` acquires `blockerLock` as well, there is practically no difference at all.

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
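[Editorial illustration] The pattern under discussion can be sketched in plain Java. The class below (`UninterruptibleLikeThread`) is a hypothetical, simplified stand-in for Spark's `UninterruptibleThread`, not its actual implementation: `interrupt()` only records state while holding the private lock and delegates to `Thread.interrupt()` outside the lock, so no application lock is held while the JDK takes its own `blockerLock`, which is the SPARK-51821-style deadlock the fix avoids.

```java
// Hypothetical simplified sketch; not Spark's actual UninterruptibleThread.
class UninterruptibleLikeThread extends Thread {
    private final Object uninterruptibleLock = new Object();
    private boolean uninterruptible = false;
    private boolean pendingInterrupt = false;

    /** Runs body with interrupts deferred; delivers a deferred interrupt afterwards. */
    public void runUninterruptibly(Runnable body) {
        synchronized (uninterruptibleLock) { uninterruptible = true; }
        try {
            body.run();
        } finally {
            boolean deliver;
            synchronized (uninterruptibleLock) {
                uninterruptible = false;
                deliver = pendingInterrupt;
                pendingInterrupt = false;
            }
            if (deliver) super.interrupt();  // outside uninterruptibleLock
        }
    }

    @Override
    public void interrupt() {
        synchronized (uninterruptibleLock) {
            if (uninterruptible) {
                pendingInterrupt = true;     // defer until the region ends
                return;
            }
        }
        super.interrupt();                   // also called outside the lock
    }
}

public class Demo {
    public static void main(String[] args) throws Exception {
        final boolean[] during = {false};
        final boolean[] after = {false};
        UninterruptibleLikeThread t = new UninterruptibleLikeThread() {
            @Override public void run() {
                runUninterruptibly(() -> {
                    this.interrupt();  // arrives inside the uninterruptible region
                    during[0] = Thread.currentThread().isInterrupted();
                });
                after[0] = Thread.currentThread().isInterrupted();
            }
        };
        t.start();
        t.join();
        // The interrupt was deferred past the region, then delivered.
        System.out.println("during=" + during[0] + " after=" + after[0]);
    }
}
```

Running the demo prints `during=false after=true`: the interrupt issued inside the uninterruptible region is swallowed and re-delivered once the region ends, while `super.interrupt()` is always invoked with `uninterruptibleLock` released.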
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org