Re: Wish to join mailing List

2023-06-30 Thread Eduard Tudenhoefner
Hi Ratnesh, you should be able to self-subscribe via the link under
https://iceberg.apache.org/community/#mailing-lists.

Eduard

On Fri, Jun 30, 2023 at 6:25 PM Ratnesh Mishra wrote:
> Hi,
>
> I want to subscribe to the Iceberg official Mailing List.
>
> Thanks,
> Ratnesh

Wish to join mailing List

2023-06-30 Thread Ratnesh Mishra
Hi,

I want to subscribe to the Iceberg official Mailing List.

Thanks,
Ratnesh

Re: How to remove an Iceberg partition that only contains parquet files with 0 record

2023-06-30 Thread Manu Zhang
Okay, maybe I took this sentence in the alter table doc too seriously 😂

> Iceberg has full ALTER TABLE support in Spark 3

On Fri, Jun 30, 2023 at 10:11 PM Pucheng Yang wrote:
> Thanks Russell, will take a look today.
>
> On Fri, J

Re: How to remove an Iceberg partition that only contains parquet files with 0 record

2023-06-30 Thread Pucheng Yang
Thanks Russell, will take a look today.

On Fri, Jun 30, 2023 at 7:08 AM wrote:
> You probably will need to manually delete the file entry using the table
> API from Java
>
> Sent from my iPhone
>
> On Jun 30, 2023, at 6:58 AM, Pucheng Yang wrote:
>
> Hi Manu, the table has already been

Re: How to remove an Iceberg partition that only contains parquet files with 0 record

2023-06-30 Thread russell.spitzer
You probably will need to manually delete the file entry using the table API from Java.

Sent from my iPhone

On Jun 30, 2023, at 6:58 AM, Pucheng Yang wrote:
> Hi Manu, the table has already been migrated to Iceberg and I think your
> command is only available to Hive tables. It seems it won't help my case.
> Appr
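For anyone finding this thread later: a rough sketch of what "manually delete the file entry using the table API from Java" could look like. This is not a vetted recipe from the thread; the class name, the use of HadoopTables, and the table location are illustrative assumptions, and any catalog that hands back an org.apache.iceberg.Table works the same way.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.iceberg.DataFile;
    import org.apache.iceberg.DeleteFiles;
    import org.apache.iceberg.FileScanTask;
    import org.apache.iceberg.Table;
    import org.apache.iceberg.hadoop.HadoopTables;
    import org.apache.iceberg.io.CloseableIterable;

    public class DropEmptyDataFiles {
      public static void main(String[] args) throws IOException {
        // Load the table. HadoopTables and the location are placeholders.
        Table table = new HadoopTables(new Configuration())
            .load("hdfs://warehouse/db/my_table");

        // Find data-file entries whose record count is zero.
        DeleteFiles delete = table.newDelete();
        boolean foundEmpty = false;
        try (CloseableIterable<FileScanTask> tasks = table.newScan().planFiles()) {
          for (FileScanTask task : tasks) {
            DataFile file = task.file();
            if (file.recordCount() == 0) {
              delete.deleteFile(file);
              foundEmpty = true;
            }
          }
        }

        // Commit removes the empty entries in a new snapshot; a partition with
        // no remaining live data files no longer appears in the table.
        if (foundEmpty) {
          delete.commit();
        }
      }
    }

If only one partition is affected, the scan could be narrowed with a filter expression instead of checking every file. Also note this only removes the metadata entries; the zero-record Parquet files stay on storage until older snapshots are expired.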

Re: How to remove an Iceberg partition that only contains parquet files with 0 record

2023-06-30 Thread Pucheng Yang
Hi Manu, the table has already been migrated to Iceberg and I think your command is only available to Hive tables. It seems it won't help my case. Appreciate your response!

On Thu, Jun 29, 2023 at 11:38 PM Manu Zhang wrote:
> You may try the following SQL, which is supported by Spark
>
> alter table ident
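(The quoted SQL is cut off in the archive. Purely as an illustration, with made-up table and partition names, a Hive-style partition drop issued through Spark generally looks like the sketch below; as the reply above notes, this syntax applies to Hive tables and does not help once the table has been migrated to Iceberg.)

    import org.apache.spark.sql.SparkSession;

    public class DropHivePartition {
      public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .appName("drop-empty-partition")
            .enableHiveSupport()
            .getOrCreate();

        // Hive-style partition drop; db.my_table and ds are placeholder names.
        // Works for Hive tables, not for tables migrated to Iceberg.
        spark.sql("ALTER TABLE db.my_table DROP IF EXISTS PARTITION (ds = '2023-06-01')");

        spark.stop();
      }
    }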