This is indeed not optimal. Could you file a JIRA issue to add this
functionality? Thanks a lot, Yuval.
Cheers,
Till
On Thu, Aug 20, 2020 at 9:47 AM Yuval Itzchakov wrote:
> Hi Till,
> KafkaSerializationSchema is only pluggable for the DataStream API, not for
> the Table API. KafkaTableSink hard codes a KeyedSerializationSchema that
> uses a null key, and this behavior can't be overridden.
Hi Till,
KafkaSerializationSchema is only pluggable for the DataStream API, not for
the Table API. KafkaTableSink hard codes a KeyedSerializationSchema that
uses a null key, and this behavior can't be overridden.
I have to say I was quite surprised by this behavior, as publishing events
to Kafka u…
Hi Yuval,
Unfortunately, setting the key or timestamp (or other metadata) from the
SQL API is not supported yet. There is an ongoing discussion to support
it [1].
Right now your option would be to change the code of KafkaTableSink and
write your own version of KafkaSerializationSchema, as Till mentioned.
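As a rough sketch of that workaround (the table name, columns, topic, and the RowKeyedSchema class are illustrative, not from the thread): leave the Table API before the sink by converting the Table to a DataStream, then attach a FlinkKafkaProducer with a custom KafkaSerializationSchema, bypassing KafkaTableSink entirely:

```java
import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class KeyedKafkaSinkExample {

    // Assumes tEnv is an already-configured StreamTableEnvironment.
    public static void wireSink(StreamTableEnvironment tEnv) {
        // Hypothetical query; the first column will become the Kafka key.
        Table resultTable = tEnv.sqlQuery("SELECT user_id, payload FROM events");

        // toAppendStream drops us back into the DataStream API, where we
        // control the elements and can therefore set the record key.
        DataStream<Row> stream = tEnv.toAppendStream(resultTable, Row.class);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");

        stream.addSink(new FlinkKafkaProducer<>(
                "output-topic",
                // RowKeyedSchema is a hypothetical KafkaSerializationSchema<Row>
                // that maps row fields to the key and value of the ProducerRecord.
                new RowKeyedSchema("output-topic"),
                props,
                FlinkKafkaProducer.Semantic.AT_LEAST_ONCE));
    }
}
```

The trade-off is losing the declarative sink definition, but it avoids patching Flink's own KafkaTableSink code.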
Hi Yuval,
it looks as if the KafkaTableSink only supports writing out rows without a
key. Pulling in Timo for verification.
If you want to use a Kafka producer which writes the records out with a
key, then please take a look at KafkaSerializationSchema. It supports this
functionality.
Cheers,
Till
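For readers following along, here is a minimal sketch of the DataStream-side interface Till refers to (assuming the universal Kafka connector from Flink 1.9+; the topic name and the Tuple2 element type are illustrative):

```java
import javax.annotation.Nullable;
import java.nio.charset.StandardCharsets;

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;

// Serializes (key, value) string pairs so that the Kafka record key is set,
// which the KafkaTableSink discussed in this thread does not allow.
public class KeyedStringSchema implements KafkaSerializationSchema<Tuple2<String, String>> {

    private final String topic;

    public KeyedStringSchema(String topic) {
        this.topic = topic;
    }

    @Override
    public ProducerRecord<byte[], byte[]> serialize(Tuple2<String, String> element,
                                                    @Nullable Long timestamp) {
        byte[] key = element.f0.getBytes(StandardCharsets.UTF_8);   // record key
        byte[] value = element.f1.getBytes(StandardCharsets.UTF_8); // record value
        return new ProducerRecord<>(topic, key, value);
    }
}
```

An instance of this schema can then be handed to a FlinkKafkaProducer in the DataStream API, so the key ends up in the Kafka record rather than being null.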
Hi,
I'm running Flink 1.9.0 and I'm trying to set the key to be published by
the Table API's Kafka Connector. I've searched the documentation but
could find no reference to such an ability.
Additionally, while browsing the code of the KafkaTableSink, it looks like
it creates a KeyedSerializationSchema…