Correct, to me it looks like a Spark bug
https://issues.apache.org/jira/browse/SPARK-51821 that may be hard to trigger
and can be reproduced using the test case provided in
https://github.com/apache/spark/pull/50594:
1. Spark UninterruptibleThread “task” is interrupted by “test” thread while
“task”
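
To make the interaction in step 1 concrete, here is a minimal sketch of the general pattern, not the actual test case from PR 50594: a "task" UninterruptibleThread is interrupted by a "test" thread (the main thread here) while it is inside runUninterruptibly. It assumes the UninterruptibleThread API (the single-name constructor and runUninterruptibly); since the class is private[spark], the sketch has to live under org.apache.spark.util, e.g. next to UninterruptibleThreadSuite. The object name, latch, and sleep are illustrative only.

```scala
package org.apache.spark.util

import java.util.concurrent.CountDownLatch

// Sketch only: a "task" UninterruptibleThread interrupted by a "test" thread
// while inside runUninterruptibly. This does NOT reproduce SPARK-51821; it
// just shows the two calls that race in the scenario described above.
object UninterruptibleInterruptSketch {
  def main(args: Array[String]): Unit = {
    val entered = new CountDownLatch(1)

    val task = new UninterruptibleThread("task") {
      override def run(): Unit = {
        runUninterruptibly {
          entered.countDown()
          try {
            // Interrupts arriving here are supposed to be deferred until the
            // uninterruptible block exits; the bug is about the window where
            // that guarantee breaks down.
            Thread.sleep(100)
          } catch {
            case _: InterruptedException => // tolerated in this sketch
          }
        }
      }
    }

    task.start()
    entered.await()
    task.interrupt() // the "test" thread's interrupt, racing with the block
    task.join()
  }
}
```
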
Correct me if I'm wrong: this is a long-standing Spark bug that is very
hard to trigger, but the new Parquet version happens to hit the trigger
condition and exposes the bug. If this is the case, I'm +1 to fix the Spark
bug instead of downgrading the Parquet version.
Let's move the technical discussion