Hi Iceberg and Spark community,
I'd like to bring your attention to a recent change[1] in Spark 3.5.3 that
effectively breaks Iceberg's SparkSessionCatalog[2] and blocks Iceberg from
upgrading to Spark 3.5.3[3].
SparkSessionCatalog, as a customized Spark V2 session catalog,
supports creating V1 tables.
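For context, SparkSessionCatalog is registered in place of Spark's built-in session catalog via configuration; the sketch below follows the Iceberg documentation (the `type=hive` value is one common choice, not the only one):

```shell
# Register Iceberg's SparkSessionCatalog as the session catalog (spark_catalog).
# Iceberg then handles its own tables and delegates non-Iceberg (V1) tables
# to Spark's built-in session catalog.
spark-sql \
  --conf spark.sql.catalog.spark_catalog=org.apache.iceberg.spark.SparkSessionCatalog \
  --conf spark.sql.catalog.spark_catalog.type=hive
```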
It's buggy behavior for a custom V2 catalog (one that does not extend
DelegatingCatalogExtension) to expect Spark to still use the V1 DDL commands
to operate on the tables inside it. This is also why third-party
catalogs (e.g., Unity Catalog and Apache Polaris) cannot be used to
override `spark_catalog`.