dejankrak-db commented on code in PR #50937: URL: https://github.com/apache/spark/pull/50937#discussion_r2096616976
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ApplyDefaultCollationToStringType.scala:
##########

@@ -82,6 +88,43 @@ object ApplyDefaultCollationToStringType extends Rule[LogicalPlan] {
     }
   }

+  /**
+   * Fetches the default collation specified at the schema level in the given logical plan.
+   * If no schema-level collation is specified, returns None.
+   */
+  private def fetchSchemaLevelDefaultCollation(plan: LogicalPlan): Option[String] = {
+    try {
+      plan match {
+        case CreateTable(ResolvedIdentifier(catalog: SupportsNamespaces, identifier), _, _, _, _) =>
+          getCollationFromSchemaMetadata(catalog, identifier.namespace())
+        case ReplaceTable(
+            ResolvedIdentifier(catalog: SupportsNamespaces, identifier), _, _, _, _) =>
+          getCollationFromSchemaMetadata(catalog, identifier.namespace())
+        case AddColumns(ResolvedTable(catalog: SupportsNamespaces, identifier, _, _), _) =>

Review Comment:
   Will the ALTER TABLE path ever be hit here? I would expect CREATE TABLE to stamp the table-level collation (either provided explicitly in the CREATE TABLE command or inherited from the schema), so ALTER TABLE commands would always find it set at the table level and never need to fall back to the schema level.

   Or there could be a case for older tables created before table-level collation support was added in 16.3: running ALTER on them (once this lands) would fall back to the schema-level collation? But even in that case, I think an ALTER SCHEMA collation change should only affect tables created afterwards, so this may be incorrect/undesired behavior that would need to be removed.

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
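The precedence model the reviewer is arguing for can be illustrated with a small standalone sketch. This is not the actual Spark code path; the case classes and method names below are hypothetical, and the sketch only models the expected behavior: CREATE TABLE stamps an effective collation on the table (explicit value, else the schema default), and a subsequent ALTER TABLE consults only the table-level value, so a later ALTER SCHEMA change cannot leak into existing tables.

```scala
// Hypothetical model of collation-resolution precedence (not the Spark API).
case class SchemaMeta(defaultCollation: Option[String])
case class TableMeta(collation: Option[String])

object CollationPrecedence {
  val SystemDefault = "UTF8_BINARY"

  // CREATE TABLE: stamp the effective collation on the table at creation
  // time, inheriting from the schema only when none is given explicitly.
  def createTable(explicit: Option[String], schema: SchemaMeta): TableMeta =
    TableMeta(Some(explicit.orElse(schema.defaultCollation).getOrElse(SystemDefault)))

  // ALTER TABLE ... ADD COLUMNS: per the review, resolve from the table
  // level only -- no fallback to the (possibly since-changed) schema level.
  def collationForNewColumn(table: TableMeta): String =
    table.collation.getOrElse(SystemDefault)
}
```

Under this model, changing the schema default after the table exists has no effect on ALTER TABLE, which is the behavior the reviewer expects the rule to preserve.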
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org