RE: Additional metadata available for Kafka serdes

2024-03-20 Thread David Radley
…To: dev@flink.apache.org Subject: [EXTERNAL] Re: Additional metadata available for Kafka serdes Hi David! I think passing the headers as a map (as opposed to ConsumerRecord/ProducerRecord) is a great idea that should work. That way the core Flink package doesn't have Kafka dependencies, it seems …
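As a rough illustration of the map-based idea, a minimal sketch follows; the HeaderAwareDeserializationSchema name and its signature are hypothetical, not an existing Flink interface, and are only meant to show how headers could reach a format as a plain java.util.Map so that flink-core keeps no Kafka dependency.

import java.io.IOException;
import java.io.Serializable;
import java.util.Map;

/**
 * Hypothetical sketch of a header-aware deserializer: headers arrive as a
 * plain Map<String, byte[]>, so nothing here references Kafka's
 * ConsumerRecord or Headers classes.
 */
public interface HeaderAwareDeserializationSchema<T> extends Serializable {

    /**
     * Deserializes the record value, with the record's headers available.
     *
     * @param message the serialized value bytes
     * @param headers header keys mapped to their raw byte values
     * @return the deserialized element
     */
    T deserialize(byte[] message, Map<String, byte[]> headers) throws IOException;
}

A Kafka source or sink could then copy each record's headers into such a map before invoking the format, keeping the Kafka-specific types confined to the connector module.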

Re: Additional metadata available for Kafka serdes

2024-03-14 Thread Balint Bene
…12 March 2024 at 22:18 > To: dev@flink.apache.org > Subject: [EXTERNAL] Additional metadata available for Kafka serdes > Hello! Looking to get some guidance for a problem around the Flink formats > used for Kafka. > > Flink currently uses common serdes interfaces across all formats …

Re: Additional metadata available for Kafka serdes

2024-03-14 Thread David Radley
…this approach will meet your needs, Kind regards, David. From: Balint Bene Date: Tuesday, 12 March 2024 at 22:18 To: dev@flink.apache.org Subject: [EXTERNAL] Additional metadata available for Kafka serdes Hello! Looking to get some guidance for a problem around the Flink formats used for Kafka …

Additional metadata available for Kafka serdes

2024-03-12 Thread Balint Bene
Hello! Looking to get some guidance for a problem around the Flink formats used for Kafka. Flink currently uses common serdes interfaces across all formats. However, some data formats used in Kafka require headers for serdes. It's the same problem for serialization and deserialization, so I'll ju…
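To make the problem concrete, here is a small hypothetical example (not Flink code) of a format that stores the writer schema name in a record header: with only the value bytes, which is all the common serdes interfaces receive today, the format cannot tell which schema to use, while a header-aware variant can. The "schema-name" header and the decoding logic are made up for illustration.

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Map;

/**
 * Hypothetical illustration of a header-dependent format. The header name
 * "schema-name" and the decoding logic are invented for this example.
 */
public class HeaderDependentFormatExample {

    /** Value-only decoding, which is all the shared serdes interfaces see. */
    static String decodeWithoutHeaders(byte[] value) {
        // The schema that produced 'value' is unknown here, so the format
        // cannot choose the correct reader schema; decoding is ambiguous.
        return new String(value, StandardCharsets.UTF_8);
    }

    /** Decoding when the record's headers are available alongside the value. */
    static String decodeWithHeaders(byte[] value, Map<String, byte[]> headers) throws IOException {
        byte[] schemaHeader = headers.get("schema-name");
        if (schemaHeader == null) {
            throw new IOException("Record is missing the 'schema-name' header");
        }
        String schemaName = new String(schemaHeader, StandardCharsets.UTF_8);
        // With the schema identified from the header, the payload can be
        // decoded against the right schema.
        return schemaName + ": " + new String(value, StandardCharsets.UTF_8);
    }
}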