Hi Russell,
Thanks for testing it out! It's a bit unfortunate that we found this issue
after the RC stage. I've made a fix for it:
https://github.com/apache/spark/pull/48257 . I think it should work, but
let's confirm it. After it gets merged, we can probably wait for a while to
accumulate more fixes.
I checked, and extending DelegatingCatalogExtension will be quite difficult,
or at least will break current Iceberg SparkSessionCatalog implementations
in several ways. Note this has nothing to do with third-party catalogs; it
is more directly about how Iceberg works with Spark regardless of the
catalog implementation.
I think it should be minimally difficult to switch this around on the
Iceberg side; we only have to move the initialize code out and duplicate
it. Not a huge cost.
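
For anyone following along, here is a rough sketch (not Iceberg's actual
code, and assuming the Spark 3.5.x APIs) of what a session catalog built on
DelegatingCatalogExtension could look like. The class name and the lazy
setup helper are invented; since DelegatingCatalogExtension keeps
initialize() as a no-op, the setup that currently lives in
SparkSessionCatalog.initialize() has to move elsewhere:

import java.util.Collections;
import java.util.Map;

import org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException;
import org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException;
import org.apache.spark.sql.connector.catalog.Column;
import org.apache.spark.sql.connector.catalog.DelegatingCatalogExtension;
import org.apache.spark.sql.connector.catalog.Identifier;
import org.apache.spark.sql.connector.catalog.Table;
import org.apache.spark.sql.connector.catalog.TableCatalog;
import org.apache.spark.sql.connector.expressions.Transform;
import org.apache.spark.sql.util.CaseInsensitiveStringMap;

// Sketch only: routes Iceberg tables to Iceberg's SparkCatalog and lets
// everything else fall through to the built-in spark_catalog delegate.
public class SketchSessionCatalog extends DelegatingCatalogExtension {

  private TableCatalog icebergCatalog;

  // initialize() can't be customized here, so the initialization that
  // Iceberg's SparkSessionCatalog does today is moved into a lazy helper.
  private TableCatalog iceberg() {
    if (icebergCatalog == null) {
      org.apache.iceberg.spark.SparkCatalog catalog =
          new org.apache.iceberg.spark.SparkCatalog();
      // Placeholder options; a real version would pass the catalog config
      // that initialize() would normally have received.
      catalog.initialize(name(),
          new CaseInsensitiveStringMap(Collections.emptyMap()));
      icebergCatalog = catalog;
    }
    return icebergCatalog;
  }

  @Override
  public Table createTable(
      Identifier ident,
      Column[] columns,
      Transform[] partitions,
      Map<String, String> properties)
      throws TableAlreadyExistsException, NoSuchNamespaceException {
    if ("iceberg".equalsIgnoreCase(properties.get(TableCatalog.PROP_PROVIDER))) {
      return iceberg().createTable(ident, columns, partitions, properties);
    }
    // Non-Iceberg tables go to the delegate, i.e. the built-in session
    // catalog, which keeps them on the V1 code path.
    return super.createTable(ident, columns, partitions, properties);
  }
}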
On Sun, Sep 22, 2024 at 11:39 PM Wenchen Fan wrote:
It's a buggy behavior that a custom v2 catalog (without extending
DelegatingCatalogExtension) expects Spark to still use the v1 DDL commands
to operate on the tables inside it. This is also why third-party
catalogs (e.g. Unity Catalog and Apache Polaris) cannot be used to
overwrite `spark_catalog`.
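
For readers less familiar with the setup: overwriting `spark_catalog` is
just configuration. A minimal example using the documented Iceberg
settings; `hive` as the catalog type and the local master are illustrative
choices only:

import org.apache.spark.sql.SparkSession;

public class SessionCatalogOverrideExample {
  public static void main(String[] args) {
    // Replace the built-in spark_catalog with Iceberg's SparkSessionCatalog.
    SparkSession spark = SparkSession.builder()
        .master("local[*]")
        .config("spark.sql.catalog.spark_catalog",
            "org.apache.iceberg.spark.SparkSessionCatalog")
        .config("spark.sql.catalog.spark_catalog.type", "hive")
        .getOrCreate();
    spark.sql("SHOW TABLES").show();
  }
}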
Hi Iceberg and Spark community,
I'd like to bring to your attention a recent change[1] in Spark 3.5.3 that
effectively breaks Iceberg's SparkSessionCatalog[2] and blocks Iceberg from
upgrading to Spark 3.5.3[3].
SparkSessionCatalog, as a customized Spark V2 session catalog,
supports creating a V1 table.
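
As a concrete illustration of that dual behavior (sketch only; it continues
the session from the config example above, and the table names are
invented):

// The same spark_catalog serves both kinds of tables:
spark.sql("CREATE TABLE db.events_ice (id BIGINT) USING iceberg"); // V2 path
spark.sql("CREATE TABLE db.events_pq (id BIGINT) USING parquet");  // V1 path

It is the second, V1-style statement that the Spark 3.5.3 change breaks.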