JoshRosen commented on code in PR #50167:
URL: https://github.com/apache/spark/pull/50167#discussion_r1984351866


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -6623,12 +6623,14 @@ class SQLConf extends Serializable with Logging with SqlApiConf {
   /** Return the value of Spark SQL configuration property for the given key. */
   @throws[NoSuchElementException]("if key is not set")
   def getConfString(key: String): String = {
-    Option(settings.get(key)).
-      orElse {
-        // Try to use the default value
-        Option(getConfigEntry(key)).map { e => e.stringConverter(e.readFrom(reader)) }
-      }.
-      getOrElse(throw QueryExecutionErrors.sqlConfigNotFoundError(key))
+    getConfStringOption(key).getOrElse(throw QueryExecutionErrors.sqlConfigNotFoundError(key))
+  }
+
+  private[sql] def getConfStringOption(key: String): Option[String] = {
+    Option(settings.get(key)).orElse {
+      // Try to use the default value
+      Option(getConfigEntry(key)).map { e => e.stringConverter(e.readFrom(reader)) }

Review Comment:
   I noticed a subtle behavior change in this updated code:
   
   If `e.stringConverter` returns `null`, the old code would convert this to `None`, whereas the new code will return `null`, which can lead to NPEs in the `SET` command when retrieving the value of an unset `OptionalConfigEntry`.
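   For illustration, the `null`-vs-`None` distinction comes down to how Scala's `Option` behaves: `Option(x).map(f)` yields `Some(null)` when `f` returns `null`, while wrapping `f`'s result in `Option(...)` collapses it to `None`. A minimal standalone sketch of that gotcha (the `converter` below is a hypothetical stand-in for `e.stringConverter` on an unset `OptionalConfigEntry`, not the real entry):
   
   ```scala
   object OptionNullDemo extends App {
     // Hypothetical stand-in for a stringConverter that yields null
     // when the underlying optional config is unset.
     val converter: Option[String] => String = _.orNull
   
     // Option(entry).map(f) does NOT re-check for null: if f returns
     // null, the result is Some(null), not None.
     val viaMap: Option[String] = Option("entry").map(_ => converter(None))
     assert(viaMap == Some(null)) // Some(null) can escape to callers and NPE later
   
     // Wrapping the converter's result in Option(...) normalizes null to None.
     val viaOption: Option[String] = Option(converter(None))
     assert(viaOption.isEmpty)
   }
   ```
   
   So a caller that pattern-matches on `Some(value)` and then dereferences `value` will blow up on the `Some(null)` path even though the `Option` itself is non-empty.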
   
   I have some ideas for a fix-forward, but I want more time to comprehensively beef up the test suite for these config pieces. For now I think we should revert this.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

