Hi,

Thanks for reaching out! Unfortunately, INSERT OVERWRITE commands are not currently supported by the Hive engine in upstream Iceberg, since this feature is difficult to implement without touching the Hive codebase as well. At the moment we only support regular insert queries (using the mr engine, not tez).
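For reference, a minimal sketch of the kind of statement that is supported today (table names are hypothetical, and hive.execution.engine may already default to mr in your setup):

  -- switch to the mr engine, since tez is not supported for Iceberg writes
  SET hive.execution.engine=mr;

  -- plain insert into an Iceberg-backed table, without a PARTITION clause
  INSERT INTO TABLE iceberg_tbl SELECT * FROM staging_tbl;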
We have recently created a new module in upstream Hive called "hive-iceberg-handler", which is intended to become an alternative, over time, to the iceberg-hive-runtime jar, providing an extended feature set and performance improvements. Support for insert overwrites has already been implemented there. Please note that the module is still experimental, and some of these newer features would require you to run Hive from the master branch due to core Hive API changes, but it might be a good candidate for experimentation. Here's the link to the module if you'd like to check out our ongoing work: https://github.com/apache/hive/tree/master/iceberg/iceberg-handler

As for the specific error message you're getting: the HMS by design sees the Iceberg table as unpartitioned (to enable flexible partitioning down the line and due to how the Hive query planner works), even though the underlying Iceberg table is actually partitioned - hence the upstream error during the compilation phase (see the short illustration below the quoted message).

Best,
Marton

On Mon, 10 May 2021 at 10:11, 1 <liubo1022...@126.com> wrote:

> The error screenshot cannot be uploaded; the Hive error is:
>
> FAILED: ValidationFailureSemanticException table is not partitioned but
> partition spec exists: {pt=xxx}
>
> liubo07199
> liubo07...@hellobike.com
>
> On 05/10/2021 16:07, 1 <liubo1022...@126.com> wrote:
>
> Hi, team
>
> When I migrate tables from Hive to Iceberg using Spark 3, the partition info
> in the DDL is hidden.
>
> When I run *insert overwrite table xxx partition (pt='xxx')* in
> Flink or the Spark SQL shell, it's OK, but when I run it in the Hive SQL shell, I
> get an error like the one below:
>
> So what can I do about it? Should I specify the partition in the DDL?
>
> liubo07199
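To illustrate the point about the PARTITION clause (table names below are hypothetical): the partition spec in the statement is what trips the validation, because Hive sees the table as unpartitioned; a regular insert that simply carries the partition value in the data is what the current integration expects, and Iceberg applies its own (hidden) partition spec when writing the files.

  -- fails in Hive: HMS registers the Iceberg table as unpartitioned
  -- INSERT OVERWRITE TABLE iceberg_tbl PARTITION (pt='xxx') SELECT ... ;

  -- works as a regular insert (no PARTITION clause, no overwrite)
  INSERT INTO TABLE iceberg_tbl SELECT * FROM staging_tbl WHERE pt = 'xxx';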