wengh commented on code in PR #49535:
URL: https://github.com/apache/spark/pull/49535#discussion_r1925772564
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -3459,6 +3459,15 @@ object SQLConf {
       .checkValues(Set("legacy", "row", "dict"))
       .createWithDefaultString("legacy")

+  val PYSPARK_HIDE_TRACEBACK =
+    buildConf("spark.sql.execution.pyspark.udf.hideTraceback.enabled")
+      .doc(
+        "When true, only show the message of the exception from Python UDFs, " +
+          "hiding the stack trace.")
+      .version("4.0.0")
+      .booleanConf
+      .createWithDefault(false)

Review Comment:
   Is there a use case where we want to show only the last k frames of the stack? I'm under the impression that we want to show the full stack trace for user exceptions, and completely hide the stack trace for library exceptions when the message is sufficient to identify the cause.

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
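The alternatives discussed here (full traceback, message only, or only the last k frames) can be sketched on the Python side with the standard `traceback` module. This is an illustrative sketch, not the PR's implementation: the helper name `format_exception_tail` and its `k` parameter are hypothetical; `k=0` mimics what `hideTraceback` enabled would show, `k=None` the default behavior.

```python
import traceback


def format_exception_tail(exc, k=None):
    """Format an exception, optionally truncating its stack trace.

    k=None : keep the full traceback (hideTraceback disabled).
    k=0    : message only, no stack trace (hideTraceback enabled).
    k>0    : keep only the last k frames (the hypothetical middle ground
             asked about in the review comment).
    """
    if k == 0:
        # Equivalent of showing only the exception message.
        return "".join(
            traceback.format_exception_only(type(exc), exc)
        ).strip()
    te = traceback.TracebackException.from_exception(exc)
    if k is not None:
        # Keep only the innermost k frames of the stack summary.
        te.stack = traceback.StackSummary.from_list(list(te.stack)[-k:])
    return "".join(te.format()).strip()
```

With a two-level call chain, `k=0` yields just `ValueError: boom`, `k=1` keeps a single `File "..."` frame, and `k=None` keeps both.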