Hi Dom,

I think support for Kafka keys would be covered by Timo's proposal for
improvements of the source / sink connectors [1].
See the section on "Concat multiple formats for accessing
connector-specific properties" in the proposal document [2].

Best, Fabian

[1]
https://lists.apache.org/thread.html/f9589cd48996c6a39c90ccb4eae201177610ef50358d9bd936ca230d@%3Cdev.flink.apache.org%3E
[2]
https://docs.google.com/document/d/1Yaxp1UJUFW-peGLt8EIidwKIZEWrrA-pznWLuvaH39Y

On Tue, Oct 23, 2018 at 14:46, Dominik Wosiński <
wos...@gmail.com> wrote:

> Hey,
> I don't think we currently support this, but it would be a good idea to
> have a Kafka sink with support for keys. I worked on something similar to
> the SQL Client before it was created, but keys in Kafka are crucial to us,
> and this is currently the limitation that keeps us from switching to the
> SQL Client.
>
> This shouldn't really be that hard, as the name of the field to be used as
> the key could be defined in the environment file, possibly when defining
> the schema. It would then be used in the *KafkaTableSourceSinkFactory* to
> create the *KeyedSerializationSchema* instead of the wrapper that is used
> currently (see the sketch after this quoted message).
>
> For some applications keys are crucial and this would be the next step to
> allow creating fully detached jobs from SQL Client.
>
> Please let me know what you think about this.
>
> Best Regards,
> Dom.
>

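For reference, here is a very rough sketch of what such a keyed schema could
look like. It is untested: the class name and the plain toString() key
encoding are made up for illustration, and the exact packages depend on the
Flink version. It assumes the factory already knows the index of the
designated key field from the environment file and has created the format's
SerializationSchema<Row> for the value part.

import java.nio.charset.StandardCharsets;

import org.apache.flink.api.common.serialization.SerializationSchema;
import org.apache.flink.streaming.util.serialization.KeyedSerializationSchema;
import org.apache.flink.types.Row;

/**
 * Serializes one designated field of the Row as the Kafka record key and
 * delegates the value to the format's SerializationSchema, instead of the
 * wrapper that currently emits a null key.
 */
public class RowFieldKeyedSerializationSchema implements KeyedSerializationSchema<Row> {

    private final int keyFieldIndex;                    // position of the key field in the table schema
    private final SerializationSchema<Row> valueSchema; // value serializer created from the format

    public RowFieldKeyedSerializationSchema(int keyFieldIndex, SerializationSchema<Row> valueSchema) {
        this.keyFieldIndex = keyFieldIndex;
        this.valueSchema = valueSchema;
    }

    @Override
    public byte[] serializeKey(Row element) {
        // Simplistic encoding: the string representation of the key field.
        Object key = element.getField(keyFieldIndex);
        return key == null ? null : key.toString().getBytes(StandardCharsets.UTF_8);
    }

    @Override
    public byte[] serializeValue(Row element) {
        return valueSchema.serialize(element);
    }

    @Override
    public String getTargetTopic(Row element) {
        // null means "use the topic configured on the producer"
        return null;
    }
}

The factory would then instantiate this with the key field index derived from
the environment file / schema properties, at the place where it currently
wraps the format's schema in a KeyedSerializationSchemaWrapper (which discards
the key).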