HyukjinKwon commented on code in PR #49535: URL: https://github.com/apache/spark/pull/49535#discussion_r1924528145
########## sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -3459,6 +3459,15 @@ object SQLConf {
       .checkValues(Set("legacy", "row", "dict"))
       .createWithDefaultString("legacy")

+  val PYSPARK_HIDE_TRACEBACK =
+    buildConf("spark.sql.execution.pyspark.udf.hideTraceback.enabled")
+      .doc(
+        "When true, only show the message of the exception from Python UDFs, " +
+          "hiding the stack trace.")
+      .version("4.0.0")
+      .booleanConf
+      .createWithDefault(false)

Review Comment:
   Another way is to create this conf as an int and show the max depth of the stack trace, but I don't feel strongly.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
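Not part of the patch, but for context: a minimal Python sketch of the behavior the proposed boolean flag would control, plus the int-valued alternative the reviewer mentions (capping the number of stack frames shown). The helper name `format_udf_error` is hypothetical; the standard-library `traceback` module already supports both modes via `format_exception_only` and the `limit` argument of `format_exception`.

```python
import traceback


def format_udf_error(exc, hide_traceback=False, max_depth=None):
    """Format an exception raised inside a (hypothetical) Python UDF.

    hide_traceback=True mirrors the proposed boolean conf: only the
    exception message is returned. max_depth mirrors the reviewer's
    int-conf alternative: show at most that many stack frames.
    """
    if hide_traceback:
        # Only the final "ExceptionType: message" line, no stack frames.
        return "".join(traceback.format_exception_only(type(exc), exc)).strip()
    # Full (or depth-limited) traceback, as Python would normally print it.
    return "".join(
        traceback.format_exception(type(exc), exc, exc.__traceback__, limit=max_depth)
    ).strip()


try:
    raise ValueError("bad input in UDF")
except ValueError as e:
    short = format_udf_error(e, hide_traceback=True)
    full = format_udf_error(e)

print(short)  # ValueError: bad input in UDF
```

With `hide_traceback=True` the user sees a single line; with the default, `full` starts with the usual `Traceback (most recent call last):` header and includes the frame that raised.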