Neil Ramaswamy created SPARK-51011:
--------------------------------------

             Summary: Add logging for whether a task is going to be interrupted 
when killing tasks
                 Key: SPARK-51011
                 URL: https://issues.apache.org/jira/browse/SPARK-51011
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 4.0.0
            Reporter: Neil Ramaswamy


I've noticed that the following sequence of events is possible with a streaming 
query:
 # The maintenance thread grabs a lock on a RocksDB instance
 # It does an unexpectedly long operation (e.g., closing the DB, which performs a 
bunch of cleanup)
 # A task thread from a _cancelled_ stage is in [this 
loop|https://github.com/apache/spark/blob/fef1b2375c3074cb3b53d5c29df1aa27c269469c/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/RocksDB.scala#L1116]
 to acquire the lock and can never get out.
 # The task reaper can't kill the task thread within 60 seconds (since it is 
stuck), so it kills the JVM.
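The stuck loop in steps 3–4 can be sketched outside Spark (hypothetical class and method names; this is not the actual RocksDB.scala code): a task thread spinning to acquire a lock held by the maintenance thread never escapes unless it actually observes the interrupt the Executor issued, which is exactly why knowing whether an interrupt was sent matters.

```java
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

public class StuckLoopSketch {
    // Spins on tryLock and never checks the interrupt flag: Thread.interrupt()
    // sets the flag, but nothing in the loop observes it, so the thread is
    // stuck for as long as the lock is held (the failure mode in step 3).
    static void acquireIgnoringInterrupt(ReentrantLock lock) {
        while (!lock.tryLock()) {
            // busy-wait; interrupts are silently ignored
        }
    }

    // Interrupt-aware variant: the timed tryLock throws InterruptedException
    // when the waiting thread is interrupted, and the explicit check covers
    // an interrupt that lands between attempts. A kill with
    // interruptThread=true can break this loop.
    static boolean acquireCheckingInterrupt(ReentrantLock lock)
            throws InterruptedException {
        while (!lock.tryLock(10, TimeUnit.MILLISECONDS)) {
            if (Thread.currentThread().isInterrupted()) {
                throw new InterruptedException("interrupted while waiting for lock");
            }
        }
        return true;
    }

    public static void main(String[] args) throws Exception {
        ReentrantLock lock = new ReentrantLock();
        lock.lock(); // simulate the maintenance thread holding the lock

        Thread task = new Thread(() -> {
            try {
                acquireCheckingInterrupt(lock);
                System.out.println("acquired");
            } catch (InterruptedException e) {
                System.out.println("exited on interrupt");
            }
        });
        task.start();
        Thread.sleep(100);
        task.interrupt(); // stand-in for a kill with interruptThread=true
        task.join(5000);
        System.out.println("task alive: " + task.isAlive());
    }
}
```

Only the interrupt-aware variant terminates here; the ignoring variant would spin forever and, in the Spark scenario, eventually trip the task reaper's JVM kill.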

This took a while to diagnose, and the difficulty was amplified by the fact that 
I couldn't tell for sure whether the Executor issued a Java interrupt when this 
stage was cancelled. It would be helpful to log this in the TaskRunner kill 
method. Effectively, I'm suggesting that:
{code:java}
25/01/27 23:49:32 INFO Executor: Executor is trying to kill foo-thread, reason: 
<whatever>{code}
be changed to:
{code:java}
25/01/27 23:49:32 INFO Executor: Executor is trying to kill foo-thread, 
interruptThread=true, reason: <whatever> {code}
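A minimal sketch of the proposed change (a hypothetical stand-in method; the real log line is emitted from the TaskRunner kill path via Spark's logging framework) is just threading the interruptThread flag into the existing message:

```java
public class KillLogSketch {
    // Hypothetical stand-in for the TaskRunner kill log line; the real
    // implementation formats this through Spark's logging framework.
    static String killMessage(String threadName, boolean interruptThread,
                              String reason) {
        return String.format(
            "Executor is trying to kill %s, interruptThread=%b, reason: %s",
            threadName, interruptThread, reason);
    }

    public static void main(String[] args) {
        System.out.println(killMessage("foo-thread", true, "stage cancelled"));
    }
}
```

With the flag in the message, a log reader can immediately tell whether the kill should have interrupted a blocked task thread or merely set the task's kill flag.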
A separate fix should be made for the streaming query issue (i.e. being in a 
potentially tight loop without checking the task context's interrupt flag), but 
since that happens in many places and requires a holistic fix, it will be 
addressed in a separate ticket.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
