Re: Failed to deserialize Avro record

2020-06-09 Thread Ramana Uppala
Hi Arvid / Dawid, Yes, we did a small POC with a custom Avro Row deserializer that uses ConfluentRegistryAvroDeserializationSchema, and we are able to parse the message. We have a Schema Registry, and users are given the choice to produce with different serialization mechanisms. Some messages we are able to…

Re: Failed to deserialize Avro record

2020-06-09 Thread Dawid Wysakowicz
Good to hear. There is no single schema that would support all serialization mechanisms. I would also rather discourage such an approach, as it makes it really hard to make changes to the records' schema. I would strongly recommend using the schema registry for all records. If you still want to have a schema that would work for bo…

Re: Failed to deserialize Avro record

2020-06-09 Thread Arvid Heise
If the data is coming from Kafka, the write schema is most likely stored in a Schema Registry. If so, you absolutely need to use ConfluentRegistryAvroDeserializationSchema from the *flink-avro-confluent-registry* package. If you didn't opt for that most common architecture pattern, then you often run into…
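A minimal sketch of the setup Arvid describes, reading registry-framed Avro into GenericRecord via the Flink 1.10-era Kafka connector. The topic name, broker address, group id, and registry URL below are placeholders, not values from this thread:

```java
// Sketch: consuming schema-registry-framed Avro from Kafka in Flink.
// All connection strings and the topic name are illustrative placeholders.
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

import java.util.Properties;

public class RegistryAvroSource {
    public static FlinkKafkaConsumer<GenericRecord> build(Schema readerSchema) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker:9092"); // placeholder
        props.setProperty("group.id", "avro-poc");             // placeholder

        // The registry-aware schema strips the magic byte and schema id,
        // fetches the writer schema from the registry, and decodes the
        // payload against the given reader schema.
        ConfluentRegistryAvroDeserializationSchema<GenericRecord> deser =
            ConfluentRegistryAvroDeserializationSchema.forGeneric(
                readerSchema, "http://schema-registry:8081");  // placeholder URL

        return new FlinkKafkaConsumer<>("input-topic", deser, props);
    }
}
```

This needs the *flink-avro-confluent-registry* and Kafka connector dependencies on the classpath; a plain AvroDeserializationSchema would fail here because the bytes are not bare Avro.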

Re: Failed to deserialize Avro record

2020-06-09 Thread Dawid Wysakowicz
It's rather hard to help if we don't know the format in which the records are serialized. There is a significant difference depending on whether you use a schema registry or not. All schema registries known to me prepend the actual data with some kind of magic byte and an identifier of the schema. Therefore, if we do…
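The framing Dawid describes can be sketched in plain Java: the Confluent wire format is one magic byte (0x00), a 4-byte big-endian schema id, then the Avro-encoded payload. The class and method names here are illustrative, not from any library:

```java
// Sketch of the schema-registry wire framing: magic byte + 4-byte schema id
// ahead of the Avro payload. A plain Avro decoder fed these bytes unstripped
// will fail exactly as reported in this thread.
import java.nio.ByteBuffer;
import java.util.Arrays;

public class WireFormat {
    static final byte MAGIC = 0x0;

    // Returns the schema id, or throws if the registry header is missing.
    static int schemaId(byte[] message) {
        if (message.length < 5 || message[0] != MAGIC) {
            throw new IllegalArgumentException("Not a registry-framed record");
        }
        return ByteBuffer.wrap(message, 1, 4).getInt();
    }

    // The actual Avro-encoded payload starts after the 5-byte header.
    static byte[] payload(byte[] message) {
        return Arrays.copyOfRange(message, 5, message.length);
    }
}
```

This is why knowing the producer-side format matters: the deserializer must either strip this header (registry-aware schema) or expect bare Avro, and mixing the two fails.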