Re: Flink SQL Cdc with schema changing

2021-05-06 Thread Taher Koitawala
Sure, here's the use case that I want to solve. I want to stream CDC records that Debezium inserts into Kafka. We want to capture all Debezium events, including ALTER TABLE ADD COLUMN, MODIFY COLUMN, inserts, updates, and deletes, over an Avro-based file format which can then be queried. For …
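The setup described above can be sketched as a Flink SQL source table reading Debezium CDC records from Kafka in Avro. This is a minimal sketch; the table, topic, and host names are hypothetical, and the exact format option keys vary across Flink versions:

```sql
-- Hypothetical source table for Debezium changelog records in Kafka,
-- encoded as Avro with schemas stored in a Confluent Schema Registry.
CREATE TABLE customers (
  id BIGINT,
  name STRING,
  email STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'dbserver1.inventory.customers',      -- Debezium's per-table topic
  'properties.bootstrap.servers' = 'kafka:9092',
  'properties.group.id' = 'flink-cdc-demo',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'debezium-avro-confluent',
  'debezium-avro-confluent.url' = 'http://schema-registry:8081'
);
```

Note that, as the rest of the thread explains, upstream schema changes (the ALTER TABLE events) are not picked up automatically by such a table; the column list is fixed at DDL time.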

Re: Flink SQL Cdc with schema changing

2021-05-05 Thread Jark Wu
Hi Taher, Could you explain a bit more about your use case and what you expect Flink SQL to support? That would help us better understand and plan the future roadmap. Best, Jark On Wed, 5 May 2021 at 19:42, Taher Koitawala wrote: > Thank you for the reply Jark Wu, however that does not satisfy …

Re: Flink SQL Cdc with schema changing

2021-05-05 Thread Taher Koitawala
Thank you for the reply Jark Wu, however that does not satisfy my requirements. My use case is to have something that supports schema drift over the Avro format. I am trying to solve for both kinds of variation: column addition and column datatype changes. Either way, thanks for the help, much …

Re: Flink SQL Cdc with schema changing

2021-05-05 Thread Jark Wu
Hi Taher, Currently, Flink (SQL) CDC doesn't support automatic schema changes and doesn't support consuming schema change events in the source. But you can upgrade the schema manually: for example, if you have a table with columns [a, b, c], you can define a Flink table t1 with these 3 columns. When you a…
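The manual upgrade Jark describes can be sketched in Flink SQL DDL. This is a sketch under assumed names (t1, topic, broker address) and an assumed new column d; the point is only that the Flink table must be dropped and recreated by hand after the upstream schema changes:

```sql
-- Step 1: define t1 to match the upstream table's current columns [a, b, c].
CREATE TABLE t1 (
  a INT,
  b STRING,
  c DOUBLE
) WITH (
  'connector' = 'kafka',
  'topic' = 'dbserver1.inventory.t1',
  'properties.bootstrap.servers' = 'kafka:9092',
  'format' = 'debezium-json'
);

-- Step 2: after an upstream ALTER TABLE t1 ADD COLUMN d, redefine the
-- Flink table with the new column and restart the job.
DROP TABLE t1;
CREATE TABLE t1 (
  a INT,
  b STRING,
  c DOUBLE,
  d TIMESTAMP(3)   -- hypothetical column added upstream
) WITH (
  'connector' = 'kafka',
  'topic' = 'dbserver1.inventory.t1',
  'properties.bootstrap.servers' = 'kafka:9092',
  'format' = 'debezium-json'
);
```

The design consequence is that every upstream schema change requires a coordinated DDL update and job restart on the Flink side, which is why the thread discusses whether automatic schema-drift handling could be supported instead.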