pan3793 commented on code in PR #54517:
URL: https://github.com/apache/spark/pull/54517#discussion_r2877766817
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -4754,6 +4743,16 @@ object SQLConf {
.enumConf(StoreAssignmentPolicy)
.createWithDefault(StoreAssignmentPolicy.ANSI)
+ val FILE_SOURCE_INSERT_ENFORCE_NOT_NULL =
+ buildConf("spark.sql.fileSource.insert.enforceNotNull")
Review Comment:
> but this one actually takes no effect on the file-only mode
That's true, because a file-based table without a catalog cannot store
constraint info, but is that worth a new config namespace? I cannot infer
such a limitation from the proposed namespace name `fileSource`. IMO, it still
fits the `spark.sql.files.` scope, and we can mention the limitation in the
config docs, if necessary.
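For illustration, a `spark.sql.files.`-scoped definition could look roughly like this (the key, doc wording, and default below are just a sketch, not a concrete proposal):

```scala
// Sketch only: keeps the flag under the existing `spark.sql.files.` scope
// and documents the catalog limitation in the config doc. The key, doc
// text, and default value are illustrative assumptions.
val FILES_INSERT_ENFORCE_NOT_NULL =
  buildConf("spark.sql.files.insert.enforceNotNull")
    .doc("When true, INSERT into a file source table enforces NOT NULL " +
      "constraints. Note: a file-based table without a catalog cannot " +
      "store constraint info, so this config has no effect in file-only mode.")
    .booleanConf
    .createWithDefault(true)
```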
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]