Thank you, Ryan, for the prompt reply,
and thank you for the warning about the Spark version: indeed, 3.1.1 failed and 3.0.1
works without any issue.
Adding the following line to the Spark conf solved the error:
.set("spark.sql.extensions",
"org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
Zoltan,
The warning is that dynamic overwrites in general aren't recommended.
ReplacePartitions is the right operation to use for dynamic overwrite; we
just want to steer users away from dynamic overwrites in general.
The problem with dynamic overwrite is that its behavior depends on the
underlying …
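To make the distinction concrete, a rough sketch with the Spark 3 DataFrameWriterV2 API (the table name, column, and data frame here are invented for illustration): a dynamic overwrite replaces whichever partitions happen to appear in the incoming data, while an explicit overwrite states the rows to replace up front.

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col

def overwriteExamples(df: DataFrame): Unit = {
  // Dynamic overwrite: only the partitions present in df are replaced
  // (ReplacePartitions under the hood), so the outcome depends on the data.
  df.writeTo("local.db.logs").overwritePartitions()

  // Explicit overwrite: the rows to replace are given as a filter,
  // independent of what the incoming data frame contains.
  df.writeTo("local.db.logs").overwrite(col("event_date") === "2021-01-29")
}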
Ismail,
MERGE INTO is supported through our SQL extensions, so you'll need to
enable them to get it working:
http://iceberg.apache.org/spark-configuration/#sql-extensions
Also, we found during the 0.11.0 release vote that Spark 3.1.1 has changes
that break the extensions. Spark 3.1 has not been released …
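In case it helps, a simplified sketch of what the statement can look like once the extensions are active on Spark 3.0.x (table, view, and column names are made-up placeholders for an SCD2-style merge; a full SCD2 flow usually also inserts the new version of changed rows in a separate step):

spark.sql("""
  MERGE INTO local.db.customers_history t
  USING staged_updates s                -- assumed temp view holding the changes
  ON t.customer_id = s.customer_id AND t.is_current = true
  WHEN MATCHED THEN
    UPDATE SET is_current = false, end_date = s.effective_date
  WHEN NOT MATCHED THEN
    INSERT (customer_id, address, is_current, effective_date, end_date)
    VALUES (s.customer_id, s.address, true, s.effective_date, null)
""")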
Hi all,
Congratulations to all on the new 0.11.0 release. I'm trying to create an SCD2 table
using the new MERGE INTO feature,
but I'm getting a "MERGE INTO TABLE is not supported temporarily." error and can't see
what is wrong.
I'm using Spark 3.1.1 and Iceberg 0.11.0.
The full code is here:
https://github.com/ismailsimsek/i
Hey everyone,
I'm currently working on the INSERT OVERWRITE statement for Iceberg tables
in Impala.
Seems like ReplacePartitions is the perfect interface for this job:
https://github.infra.cloudera.com/CDH/iceberg/blob/cdpd-master/api/src/main/java/org/apache/iceberg/ReplacePartitions.java
IIUC
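For what it's worth, a bare-bones sketch of how that interface is typically driven from the Java API (the Table handle and the DataFile construction are assumed to come from Impala's own planner and writers; nothing here is Impala code):

import org.apache.iceberg.{DataFile, Table}

// Replaces every partition touched by the new files with exactly those files
// in one atomic snapshot, which matches dynamic INSERT OVERWRITE semantics.
def insertOverwrite(table: Table, newFiles: Seq[DataFile]): Unit = {
  val replace = table.newReplacePartitions()
  newFiles.foreach(replace.addFile)
  replace.commit()
}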
Hi all,
We just released our data skipping index library as open source and are
hoping it can also serve as input to a discussion on this topic. Please let us
know if you have any questions or feedback:
https://github.com/xskipper-io/xskipper/
On 2021/01/29 04:39:07 Miao Wang wrote:
> Hi @OpenInx,