summaryzb opened a new pull request, #50489: URL: https://github.com/apache/spark/pull/50489
### What changes were proposed in this pull request?

This PR makes it possible to swallow `SparkRuntimeException` while assembling the `Serialization stack`. This behavior is consistent with SPARK-7187, which ensures that `SerializationDebugger` does not crash user code.

### Why are the changes needed?

It is better to show the `Serialization stack` than an unrelated underlying exception; this helps users debug the real problem.

### Does this PR introduce _any_ user-facing change?

Yes. Users will see the direct serialization exception and the reference chain in addition to the root cause. Before this PR, users could be confused when the exception below was shown:

```
WARN org.apache.spark.serializer.SerializationDebugger: Exception in serialization debugger
org.apache.spark.SparkRuntimeException: Cannot get SQLConf inside scheduler event loop thread.
    at org.apache.spark.sql.errors.QueryExecutionErrors$.cannotGetSQLConfInSchedulerEventLoopThreadError(QueryExecutionErrors.scala:2002)
    at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:225)
    at org.apache.spark.sql.execution.ScalarSubquery.toString(subquery.scala:69)
    at java.lang.String.valueOf(String.java:2994)
    at scala.collection.mutable.StringBuilder.append(StringBuilder.scala:203)
    at scala.collection.immutable.Stream.addString(Stream.scala:701)
    at scala.collection.TraversableOnce.mkString(TraversableOnce.scala:377)
org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException:
```

### How was this patch tested?

Pass GitHub Actions.

### Was this patch authored or co-authored using generative AI tooling?

No

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
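The idea behind the fix can be sketched outside Spark. This is a minimal, hypothetical illustration (class and method names are invented, and it is written in Java rather than Spark's Scala): while a debugger assembles a reference-chain description, calling `toString()` on an element may itself throw (as `ScalarSubquery.toString` does on the scheduler event loop thread), so that exception is swallowed and replaced with a placeholder instead of letting it mask the real serialization failure.

```java
import java.util.List;

public class SerializationStackSketch {

    // An element whose toString() throws, mimicking ScalarSubquery.toString
    // raising SparkRuntimeException on the scheduler event loop thread.
    static class Throwing {
        @Override
        public String toString() {
            throw new RuntimeException(
                "Cannot get SQLConf inside scheduler event loop thread.");
        }
    }

    // Build the reference-chain description, tolerating toString() failures
    // so the debugger itself never crashes while reporting the real problem.
    static String describeStack(List<Object> chain) {
        StringBuilder sb = new StringBuilder("Serialization stack:\n");
        for (Object o : chain) {
            String desc;
            try {
                desc = String.valueOf(o);      // may invoke a user toString()
            } catch (RuntimeException e) {     // swallow and keep going
                desc = "<toString() failed: " + e.getMessage() + ">";
            }
            sb.append("\t- object: ").append(desc).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // The second element's toString() throws, but the stack still prints.
        System.out.println(describeStack(List.of("rdd", new Throwing())));
    }
}
```

Without the try/catch, the exception from the second element would abort the whole description, which is exactly the confusing behavior the PR removes.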
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org