Hi Team,

I'm playing around with creating a Flink Dynamic Sink which would allow
schema changes without the need for a job restart. When a record with an
unknown schema arrives, the sink would update the Iceberg table to the new
schema and continue processing the records.

Let's say I have the `Schema newSchema` and `PartitionSpec newSpec` at
hand, and I have the `Table icebergTable` with a different Schema and
PartitionSpec. I know that we have `Table.updateSchema` and
`Table.updateSpec` to modify them, but these methods in the API only allow
for incremental changes (addColumn, updateColumn, or addField,
removeField). Do we have an existing API for effectively updating the
Iceberg Table schema/spec to a new one, if we have the target schema and
spec at hand?
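
For reference, this is roughly what the incremental route looks like today,
as far as I can tell (a minimal sketch; the column and field names are made
up for illustration):

```java
import org.apache.iceberg.Table;
import org.apache.iceberg.types.Types;

public class IncrementalUpdateSketch {
  static void evolve(Table icebergTable) {
    // Schema evolution: one explicit change at a time
    icebergTable.updateSchema()
        .addColumn("new_col", Types.StringType.get())
        .commit();

    // Partition spec evolution: add/remove individual fields
    icebergTable.updateSpec()
        .addField("new_col")
        .commit();
  }
}
```

What I'd like to avoid is hand-rolling the diffing between the current and
the target schema/spec and translating it into these per-field calls, if
something already exists for that.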

Thanks,
Peter
