Thanks Fabian,
I ended up using something like below.
public class GenericSerializer implements KafkaSerializationSchema<GenericRecord> {

    private final SerializationSchema<GenericRecord> valueSerializer;
    private final String topic;

    public GenericSerializer(String topic, Schema schemaValue, String schemaRegistryUrl) {
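The snippet above is cut off by the archive, but the pattern it describes — wrapping a value-only SerializationSchema in a KafkaSerializationSchema that targets a fixed topic — can be sketched as below. Note this is a self-contained illustration: the `SerializationSchema`, `KafkaSerializationSchema`, and `ProducerRecordStub` types here are minimal stand-ins for Flink's and Kafka's real interfaces (`org.apache.flink.api.common.serialization.SerializationSchema`, `org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema`, `org.apache.kafka.clients.producer.ProducerRecord`), so the sketch compiles without Flink on the classpath.

```java
import java.nio.charset.StandardCharsets;

// Stand-in for Flink's SerializationSchema<T>: turns a value into bytes.
interface SerializationSchema<T> {
    byte[] serialize(T element);
}

// Stand-in for Kafka's ProducerRecord: a topic plus a serialized value.
class ProducerRecordStub {
    final String topic;
    final byte[] value;
    ProducerRecordStub(String topic, byte[] value) {
        this.topic = topic;
        this.value = value;
    }
}

// Stand-in for Flink's KafkaSerializationSchema<T>.
interface KafkaSerializationSchema<T> {
    ProducerRecordStub serialize(T element, Long timestamp);
}

// The wrapper pattern from the thread: adapt a value serializer
// (e.g. one built by ConfluentRegistryAvroSerializationSchema.forGeneric)
// so it can be handed to a producer that expects a KafkaSerializationSchema.
class GenericSerializer<T> implements KafkaSerializationSchema<T> {
    private final SerializationSchema<T> valueSerializer;
    private final String topic;

    GenericSerializer(String topic, SerializationSchema<T> valueSerializer) {
        this.topic = topic;
        this.valueSerializer = valueSerializer;
    }

    @Override
    public ProducerRecordStub serialize(T element, Long timestamp) {
        // Delegate the value serialization, attach the fixed topic.
        return new ProducerRecordStub(topic, valueSerializer.serialize(element));
    }
}

public class Main {
    public static void main(String[] args) {
        // A trivial value serializer standing in for the Avro/registry one.
        SerializationSchema<String> utf8 = s -> s.getBytes(StandardCharsets.UTF_8);
        KafkaSerializationSchema<String> wrapped =
                new GenericSerializer<>("my-topic", utf8);
        ProducerRecordStub rec = wrapped.serialize("hello", null);
        System.out.println(rec.topic + ":" + new String(rec.value, StandardCharsets.UTF_8));
    }
}
```

In a real job the constructor would instead take the `Schema` and registry URL (as in the quoted snippet) and build the Avro serializer internally; the delegation in `serialize` stays the same.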
Thanks for sharing your solution Anil!
Cheers, Fabian
On Tue., Apr. 21, 2020 at 09:35, Anil K wrote:
Hi Anil,
Here's a pointer to Flink's end-to-end test that checks the integration
with the schema registry [1].
It was recently updated, so I hope it works the same way in Flink 1.9.
Best,
Fabian
[1]
https://github.com/apache/flink/blob/master/flink-end-to-end-tests/flink-confluent-schema-registry/
Hi,
What is the best way to use Confluent SchemaRegistry with
FlinkKafkaProducer?
What I have right now is as follows.
SerializationSchema<GenericRecord> serializationSchema =
    ConfluentRegistryAvroSerializationSchema.forGeneric(topic, schema, schemaRegistryUrl);

FlinkKafkaProducer<GenericRecord> kafkaProducer =
    new