wuchong commented on a change in pull request #14743:
URL: https://github.com/apache/flink/pull/14743#discussion_r575297850
##########
File path: docs/content.zh/docs/connectors/table/kafka.md
##########
@@ -499,12 +499,13 @@ If `timestamp` is specified, another config option `scan.startup.timestamp-milli
 If `specific-offsets` is specified, another config option `scan.startup.specific-offsets` is required to specify specific startup offsets for each partition, e.g. an option value `partition:0,offset:42;partition:1,offset:300` indicates offset `42` for partition `0` and offset `300` for partition `1`.
-### Changelog Source
+### CDC Changelog Source
+
+Flink natively supports Kafka as a CDC changelog source. If the messages in a Kafka topic are change events captured from other databases by CDC tools, you can use a CDC format to interpret the messages as INSERT/UPDATE/DELETE messages in the Flink SQL system.
+
+Flink provides three CDC formats: [debezium-json]({% link dev/table/connectors/formats/debezium.md %}), [canal-json]({% link dev/table/connectors/formats/canal.md %}) and [maxwell-json]({% link dev/table/connectors/formats/maxwell.md %}) to interpret change events captured by [Debezium](https://debezium.io/), [Canal](https://github.com/alibaba/canal/wiki) and [Maxwell](https://maxwells-daemon.io/).

Review comment:
   We also support `debezium-avro-confluent` now. Its documentation is located at https://ci.apache.org/projects/flink/flink-docs-master/zh/docs/connectors/table/formats/debezium/
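
For reference, a minimal sketch of what such a CDC changelog source declaration looks like in Flink SQL, using the `debezium-json` format named in the diff. The table name, topic, columns, and connection properties below are illustrative assumptions, not taken from the PR:

```sql
-- Minimal sketch (assumed names): a Kafka table whose Debezium JSON change
-- events are interpreted as a changelog (INSERT/UPDATE/DELETE) source.
CREATE TABLE products (
  id BIGINT,
  name STRING,
  weight DECIMAL(10, 2)
) WITH (
  'connector' = 'kafka',
  'topic' = 'products_binlog',                        -- hypothetical topic
  'properties.bootstrap.servers' = 'localhost:9092',  -- hypothetical broker
  'scan.startup.mode' = 'earliest-offset',
  -- 'canal-json', 'maxwell-json', or 'debezium-avro-confluent'
  -- (the format the reviewer mentions) can be used here as well
  'format' = 'debezium-json'
);
```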