[ https://issues.apache.org/jira/browse/FLINK-31085?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17701032#comment-17701032 ]

Martijn Visser commented on FLINK-31085:
----------------------------------------

Note: this change also includes a change in the Kafka connector test, which 
(presumably; I haven't looked at the details) will need to be moved to 
flink-connector-kafka, since we'll move the Kafka connector out of the Flink 
repo in 1.18.

> Add schema option to confluent registry avro formats
> ----------------------------------------------------
>
>                 Key: FLINK-31085
>                 URL: https://issues.apache.org/jira/browse/FLINK-31085
>             Project: Flink
>          Issue Type: Improvement
>          Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
>            Reporter: Ferenc Csaky
>            Assignee: Ferenc Csaky
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 1.18.0
>
>
> When using the {{avro-confluent}} and {{debezium-avro-confluent}} formats 
> with schemas already defined in the Confluent Schema Registry, serialization 
> fails because Flink uses the default name {{record}} when converting row 
> types to Avro schemas. So if the predefined schema has a different name, the 
> serialization schema is incompatible with the registered schema due to the 
> name mismatch. See 
> [this|https://lists.apache.org/thread/5xppmnqjqwfzxqo4gvd3lzz8wzs566zp] 
> thread for steps to reproduce the issue.
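For illustration, a minimal Flink SQL sketch of what the proposed option would
look like in a table definition: passing the registered schema explicitly so
the record name matches the one in the registry, instead of letting Flink
derive a schema named {{record}} from the row type. The option key
{{avro-confluent.schema}}, the table, the topic, the registry URL, and the
schema JSON are all assumptions for this example; the issue itself does not
name the final option key.

{code:sql}
-- Hypothetical example: table, topic, registry URL, and option key are
-- assumptions, not taken from the issue.
CREATE TABLE user_events (
  id BIGINT,
  name STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'user-events',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'avro-confluent',
  'avro-confluent.url' = 'http://localhost:8081',
  -- Without this, Flink converts the row type to an Avro schema named
  -- "record", which clashes with a registered schema that uses a
  -- different record name.
  'avro-confluent.schema' = '{"type":"record","name":"UserEvent","namespace":"com.example","fields":[{"name":"id","type":"long"},{"name":"name","type":"string"}]}'
);
{code}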


