szehon-ho commented on code in PR #49840:
URL: https://github.com/apache/spark/pull/49840#discussion_r1945608585
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##########
@@ -52,6 +52,17 @@ object ResolveDefaultColumns extends QueryErrorsBase
   // CURRENT_DEFAULT_COLUMN_METADATA.
   val CURRENT_DEFAULT_COLUMN_NAME = "DEFAULT"

+  var defaultColumnAnalyzer: Analyzer = DefaultColumnAnalyzer
+  var defaultColumnOptimizer: Optimizer = DefaultColumnOptimizer
+
+  /**
+   * Visible for testing
+   */
+  def setAnalyzerAndOptimizer(analyzer: Analyzer, optimizer: Optimizer): Unit = {

Review Comment:
   It's hard to reproduce the issue in a unit test, so I ended up mocking these members to verify that the catalogs are not called. Let me know; I'm totally fine with removing this if we think the existing test coverage is OK.
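   For illustration, a minimal, self-contained sketch of the swap-in pattern this setter enables. Everything below (the `Analyzer`/`Optimizer` traits, the `ResolveDefaults` object, and the guard doubles) is a simplified stand-in invented for this example, not Spark's actual classes; it only shows how a test can install doubles via the setter, assert that the catalog path is never touched, and restore the originals afterwards.

   ```scala
   // Minimal sketch of the save/swap/restore test pattern. All names here are
   // hypothetical stand-ins, NOT Spark's real Analyzer/Optimizer classes.
   trait Analyzer  { def execute(plan: String): String }
   trait Optimizer { def execute(plan: String): String }

   object ResolveDefaults {
     // Production defaults, overridable for tests (mirrors the PR's vars).
     var defaultColumnAnalyzer: Analyzer =
       new Analyzer { def execute(plan: String): String = plan }
     var defaultColumnOptimizer: Optimizer =
       new Optimizer { def execute(plan: String): String = plan }

     /** Visible for testing: swap in test doubles. */
     def setAnalyzerAndOptimizer(a: Analyzer, o: Optimizer): Unit = {
       defaultColumnAnalyzer = a
       defaultColumnOptimizer = o
     }

     // Code under test: resolves a default expression through both phases.
     def resolve(expr: String): String =
       defaultColumnOptimizer.execute(defaultColumnAnalyzer.execute(expr))
   }

   object SetterInjectionExample extends App {
     // A guard double that fails loudly if the (hypothetical) catalog path is hit.
     val guardAnalyzer = new Analyzer {
       def execute(plan: String): String = {
         require(!plan.contains("catalog"), "catalog should not be consulted")
         plan
       }
     }
     val passThroughOptimizer = new Optimizer {
       def execute(plan: String): String = plan
     }

     // Save the originals and restore them in `finally`, so one test cannot
     // leak mocked state into the next.
     val saved =
       (ResolveDefaults.defaultColumnAnalyzer, ResolveDefaults.defaultColumnOptimizer)
     ResolveDefaults.setAnalyzerAndOptimizer(guardAnalyzer, passThroughOptimizer)
     try {
       assert(ResolveDefaults.resolve("CURRENT_DATE()") == "CURRENT_DATE()")
     } finally {
       ResolveDefaults.setAnalyzerAndOptimizer(saved._1, saved._2)
     }
     println("ok: catalogs were never consulted")
   }
   ```

   A `var` plus a test-only setter is the lightest-weight injection mechanism for this; constructor injection or a thread-local override would avoid mutable global state, but at the cost of a larger refactor.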