Hi,

I am learning Flink. With Flink 1.7.1, I am trying to read from Kafka and
insert into Elasticsearch. I use a Kafka connector to expose the data as a
Flink table. In order to insert into Elasticsearch, I have converted this
table to a DataStream so that I can use the ElasticsearchSink. But the Rows
returned by the stream have lost the schema. How do I convert them to JSON
before calling the Elasticsearch sink connector? Any help or suggestions
would be appreciated.
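For reference, here is a minimal sketch of what I have so far, plus the kind
of Row-to-JSON mapping I am considering (re-attaching the field names from
the table schema and letting the Elasticsearch client serialize the Map as a
JSON document). The registered table name "kafka_source", the index name
"my-index" and the Elasticsearch host are just placeholders:

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.functions.RuntimeContext;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkFunction;
import org.apache.flink.streaming.connectors.elasticsearch.RequestIndexer;
import org.apache.flink.streaming.connectors.elasticsearch6.ElasticsearchSink;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;
import org.apache.http.HttpHost;
import org.elasticsearch.client.Requests;

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class KafkaToElasticsearch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);

        // "kafka_source" is already registered via the Kafka table connector (omitted here).
        Table table = tableEnv.scan("kafka_source");

        // The Rows coming out of toAppendStream no longer carry field names,
        // so capture them from the table schema before the conversion.
        final String[] fieldNames = table.getSchema().getFieldNames();

        DataStream<Row> rows = tableEnv.toAppendStream(table, Row.class);

        // Re-attach the schema by zipping field names with the Row values into a Map,
        // which the Elasticsearch client can serialize as a JSON document.
        DataStream<Map<String, Object>> docs = rows.map(new MapFunction<Row, Map<String, Object>>() {
            @Override
            public Map<String, Object> map(Row row) {
                Map<String, Object> doc = new HashMap<>();
                for (int i = 0; i < row.getArity(); i++) {
                    doc.put(fieldNames[i], row.getField(i));
                }
                return doc;
            }
        });

        ElasticsearchSink.Builder<Map<String, Object>> builder = new ElasticsearchSink.Builder<>(
            Collections.singletonList(new HttpHost("localhost", 9200, "http")),
            new ElasticsearchSinkFunction<Map<String, Object>>() {
                @Override
                public void process(Map<String, Object> doc, RuntimeContext ctx, RequestIndexer indexer) {
                    indexer.add(Requests.indexRequest()
                        .index("my-index")   // placeholder index name
                        .type("_doc")
                        .source(doc));
                }
            });

        docs.addSink(builder.build());
        env.execute("kafka-to-elasticsearch");
    }
}

Is this Map-based approach reasonable, or is there a more idiomatic way to
get JSON out of the Rows?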

Thanks.
