Hi Arvid / Dawid,
Yes, we did a small POC with a custom Avro row deserializer that uses
ConfluentRegistryAvroDeserializationSchema, and we are able to parse the messages.
We have a Schema Registry, and users are given the choice to produce with
different serialization mechanisms. Some messages we are able to …
Good to hear.
There is no single schema that would support all serialization mechanisms. I
would also rather discourage such an approach, as it makes it really hard to
evolve the record schema. I would strongly recommend using the schema registry
for all records.
If you still want to have a schema that would work for both …
If data is coming from Kafka, the write schema is most likely stored in a
Schema Registry. If so, you absolutely need to use
ConfluentRegistryAvroDeserializationSchema from the
*flink-avro-confluent-registry* package.
If you didn't opt for that most common architecture pattern, then you often
run into …
It's rather hard to help if we don't know the format in which the records
are serialized. There is a significant difference between using a schema
registry and not using one. All schema registries known to me prepend the
actual data with some kind of magic byte and an identifier of the schema.
Therefore, if we do …
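To make the framing concrete: in Confluent's wire format, each record starts with a single zero magic byte, followed by the schema id as a 4-byte big-endian integer, and only then the Avro binary payload. A minimal, self-contained sketch of reading that header (the class and method names here are hypothetical, not part of any Flink or Confluent API):

```java
import java.nio.ByteBuffer;

// Hypothetical helper illustrating the Confluent wire format:
// byte 0 is the magic byte (0x0), bytes 1..4 are the schema id
// (big-endian int), and the Avro binary payload follows.
public class ConfluentWireFormat {
    public static final byte MAGIC_BYTE = 0x0;

    /** Returns the schema registry id encoded in the 5-byte header. */
    public static int schemaId(byte[] record) {
        if (record.length < 5 || record[0] != MAGIC_BYTE) {
            throw new IllegalArgumentException("Not a Confluent-framed record");
        }
        // Read the 4 bytes after the magic byte as a big-endian int.
        return ByteBuffer.wrap(record, 1, 4).getInt();
    }

    /** Returns the raw Avro payload that follows the header. */
    public static byte[] avroPayload(byte[] record) {
        byte[] payload = new byte[record.length - 5];
        System.arraycopy(record, 5, payload, 0, payload.length);
        return payload;
    }
}
```

This is also why a plain Avro deserializer fails on registry-produced records: the 5-byte header is not valid Avro, whereas ConfluentRegistryAvroDeserializationSchema strips it and resolves the write schema by id before decoding.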