Hi Elias

Thanks for letting me know. I have found it (a small read-side usage sketch is 
at the bottom of this mail), but we also need the option to register Avro 
schemas and use the registry when we write to Kafka. So we will create a 
serialisation counterpart and, once it works, implement it in Flink and 
create a pull request for the community. 
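
Roughly what we have in mind for the write side is a SerializationSchema that 
wraps Confluent's KafkaAvroSerializer, so the writer schema gets registered 
with the registry on first use. An untested sketch only; the class name is 
our own, and the registry config and topic are placeholders:

import java.util.Map;
import org.apache.avro.specific.SpecificRecord;
import org.apache.flink.api.common.serialization.SerializationSchema;
import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class ConfluentRegistryAvroSerializationSchema<T extends SpecificRecord>
        implements SerializationSchema<T> {

    private final String topic;
    private final Map<String, Object> registryConfig;  // e.g. schema.registry.url

    private transient KafkaAvroSerializer serializer;

    public ConfluentRegistryAvroSerializationSchema(
            String topic, Map<String, Object> registryConfig) {
        this.topic = topic;
        this.registryConfig = registryConfig;
    }

    @Override
    public byte[] serialize(T element) {
        if (serializer == null) {
            // Created lazily: the Confluent client is not java.io.Serializable,
            // so it has to be built on the task manager, not on the client.
            serializer = new KafkaAvroSerializer();
            serializer.configure(registryConfig, false);  // false = value serializer
        }
        // Registers the writer schema under the "<topic>-value" subject on
        // first use (with the Confluent default auto.register.schemas=true).
        return serializer.serialize(topic, element);
    }
}

It would then plug into a FlinkKafkaProducer via the plain SerializationSchema 
constructor like any other schema.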

Med venlig hilsen / Best regards
Lasse Nedergaard


> On 12 Sep 2019, at 17:45, Elias Levy <fearsome.lucid...@gmail.com> wrote:
> 
> Just for a Kafka source:
> 
> https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#the-deserializationschema
> 
> There is also a version of this schema available that can look up the 
> writer's schema (the schema that was used to write the record) in the 
> Confluent Schema Registry. With this deserialization schema, a record is 
> read with the schema retrieved from the Schema Registry and transformed 
> into a statically provided reader schema (obtained either through 
> ConfluentRegistryAvroDeserializationSchema.forGeneric(...) or 
> ConfluentRegistryAvroDeserializationSchema.forSpecific(...)).
> 
>> On Wed, Sep 11, 2019 at 1:48 PM Lasse Nedergaard <lassenederga...@gmail.com> 
>> wrote:
>> Hi. 
>> Does Flink have out-of-the-box support for the Kafka Schema Registry for 
>> both sources and sinks?
>> If not, does anyone know of an implementation we can build on, so we can 
>> help make it generally available in a future release? 
>> 
>> Med venlig hilsen / Best regards
>> Lasse Nedergaard
>> 
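
For the read side Elias describes above, the wiring would look roughly like 
this. Again an untested sketch; the reader schema, topic, bootstrap servers, 
and registry URL are all placeholders:

import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class RegistrySourceExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment();

        // Reader schema: records written with other schema versions are
        // looked up in the registry and resolved against this one.
        Schema readerSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Event\","
            + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "registry-example");

        env.addSource(new FlinkKafkaConsumer<>(
                "events",  // placeholder topic
                ConfluentRegistryAvroDeserializationSchema.forGeneric(
                    readerSchema, "http://localhost:8081"),  // placeholder URL
                props))
           .print();

        env.execute("registry-source-example");
    }
}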
