Hi.
Yes, Flink supports writing column values to the Kafka record key. You
just need to specify which columns belong to the key via the connector
options in the WITH clause, e.g.
```
CREATE TABLE kafka_sink (
  ...
) WITH (
  'connector' = 'kafka',
  'key.format' = 'json',    -- a key format is required when 'key.fields' is set
  'key.fields' = 'id',
  'value.format' = 'json'
);
```
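For instance (a minimal sketch; it assumes the kafka_sink table above has an
id column and a payload column, both hypothetical here), a regular INSERT is
then enough to populate the key:
```
-- 'id' is written to the Kafka record key; by default it also stays in the
-- record value unless 'value.fields-include' is set to 'EXCEPT_KEY'
INSERT INTO kafka_sink VALUES ('order-42', 'hello kafka');
```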
[1]
https://nightlies.apache.org/flink/flink-docs-master/docs/conne
Hey wang!
Perhaps this is what you want:
https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/kafka/#key-format
&
https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/kafka/#key-fields
?
Note that the fields *have* to be "top" level physical columns of the table
schema; nested fields cannot be referenced in 'key.fields'.
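To make the two options concrete, here is a minimal sketch along the lines of
the linked docs (topic, bootstrap servers, and column names are placeholders,
not taken from your setup):
```
CREATE TABLE kafka_sink (
  user_id STRING,       -- top-level column, so it can be listed in 'key.fields'
  message STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'my-topic',
  'properties.bootstrap.servers' = 'localhost:9092',
  'key.format' = 'raw',
  'key.fields' = 'user_id',
  'value.format' = 'json'
);
```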
Hi dear engineer,
Flink SQL supports Kafka sink tables, but I am not sure whether it also
supports the Kafka record key in a Kafka sink table. I want to specify the
Kafka key when inserting data into the Kafka sink table.
Thanks in advance for your answer.
Thanks && Regards,
Hunk