To: dev@flink.apache.org
Subject: [EXTERNAL] Re: Additional metadata available for Kafka serdes
Hi David!
I think passing the headers as a map (as opposed to a ConsumerRecord/ProducerRecord) is a great idea that should work. That way the core Flink package doesn't have Kafka dependencies, it seems.
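To make the idea concrete, here is a minimal sketch of what a map-based, header-aware deserialization interface could look like. All names here (the interface, the `schema-id` header key) are illustrative assumptions, not actual Flink API: the point is only that the format sees headers as a plain `Map<String, byte[]>`, which the Kafka connector extracts before invoking the format, so the format module never touches Kafka client classes.

```java
import java.nio.charset.StandardCharsets;
import java.util.Map;

// Hypothetical sketch, not actual Flink API: a format-side
// deserialization interface that receives Kafka record headers as a
// plain Map, so core/format modules need no Kafka client dependency.
interface HeaderAwareDeserializationSchema<T> {
    T deserialize(byte[] message, Map<String, byte[]> headers) throws Exception;
}

public class HeaderMapSketch {
    public static void main(String[] args) throws Exception {
        // Example implementation: read an illustrative "schema-id"
        // header and prefix it to the decoded payload.
        HeaderAwareDeserializationSchema<String> schema = (message, headers) -> {
            byte[] schemaId = headers.get("schema-id");
            String prefix = schemaId == null
                    ? "unknown"
                    : new String(schemaId, StandardCharsets.UTF_8);
            return prefix + ":" + new String(message, StandardCharsets.UTF_8);
        };

        String out = schema.deserialize(
                "payload".getBytes(StandardCharsets.UTF_8),
                Map.of("schema-id", "42".getBytes(StandardCharsets.UTF_8)));
        System.out.println(out); // prints "42:payload"
    }
}
```

The connector would own the one Kafka-specific step (copying `ConsumerRecord.headers()` into the map), keeping the format interface connector-agnostic.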
… this approach will meet your needs.
Kind regards, David.
From: Balint Bene
Date: Tuesday, 12 March 2024 at 22:18
To: dev@flink.apache.org
Subject: [EXTERNAL] Additional metadata available for Kafka serdes
Hello! Looking to get some guidance for a problem around the Flink formats
used for Kafka.
Flink currently uses common serdes interfaces across all formats. However,
some data formats used in Kafka require headers for serdes. It's the same
problem for serialization and deserialization, so I'll ju…