Have you tried this?

   1. Copy the value of `spark.sql.extensions` (a comma-separated
   list) from an existing Spark job on Databricks
   2. Append
   `org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions` to the
   comma-separated list
   3. Submit a new job with `spark.sql.extensions` set to the modified value
   (a sketch follows below)
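
For illustration, a minimal sketch of steps 1-2 in Scala, assuming you read the
current value from a notebook attached to a running cluster (variable names are
placeholders, not anything Databricks-specific):

    // Read whatever spark.sql.extensions the Databricks runtime already sets
    val existing = spark.conf.getOption("spark.sql.extensions").getOrElse("")
    val iceberg  = "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions"
    // Append Iceberg's extension rather than replacing the existing ones
    val merged   = if (existing.isEmpty) iceberg else s"$existing,$iceberg"
    // Use `merged` as the spark.sql.extensions value when submitting the new job
    println(merged)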


On Fri, Apr 28, 2023 at 12:28 PM Pani Dhakshnamurthy <dpa...@gmail.com>
wrote:

> Hi all,
>
> I am currently attempting to make Iceberg work on the Databricks platform.
> Without adding IcebergSparkSessionExtensions, I am able to perform both
> data and metadata reads. However, when I add IcebergSparkSessionExtensions
> with spark.sql.extensions, Databricks throws the following error:
>
> Caused by: java.lang.NoClassDefFoundError:
> org/apache/spark/sql/catalyst/analysis/ResolveProcedures
> ... 41 more
> Caused by: java.lang.ClassNotFoundException:
> org.apache.spark.sql.catalyst.analysis.ResolveProcedures
> at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
> ... 41 more
>
> If I don't add IcebergSparkSessionExtensions, I receive the following
> error from DBR when I execute an UPDATE SQL statement:
>
> Error in SQL statement: DeltaAnalysisException: UPDATE destination only
> supports Delta sources.
>
> The Databricks 12.2 LTS (includes Apache Spark 3.3.2, Scala 2.12) runtime
> comes with a shaded Iceberg jar with version 0.11. For this testing, I
> removed it and added iceberg-spark-runtime-3.3_2.12-1.2.1.jar to the
> classpath.
>
> Has anyone successfully used all Iceberg-supported features on Databricks?
> Any help would be greatly appreciated. Thank you in advance.
>
> Best regards,
> Pani
>


-- 
John Zhuge
