Hello All,

I am receiving many events in Kafka, and I have written a Flink job that sinks
the Avro records from Kafka to S3 in Parquet format.

Now I want to sink these records into Elasticsearch as well. The only
challenge is that I want to write records to time-based indices: in
Elasticsearch, I want a per-day index with the date as a suffix.
If I create an Elasticsearch sink in the Flink streaming job, how can I make
the sink start writing to a new index when the first event of the day arrives?
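For context, this per-record index derivation is roughly what I have in mind, since (as far as I understand) Flink's Elasticsearch connector lets you choose the index per record when building each IndexRequest inside the sink function, so no sink "switch" should be needed. A minimal sketch, assuming each event carries an epoch-millis timestamp and the index prefix `events` is just a placeholder:

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class DailyIndex {
    // Format the UTC day of the event; adjust the zone if indices
    // should roll over in a local time zone instead.
    private static final DateTimeFormatter DAY =
            DateTimeFormatter.ofPattern("yyyy-MM-dd").withZone(ZoneOffset.UTC);

    // Derive the per-day index name from the record's event timestamp.
    // This would be called per element when building the IndexRequest.
    public static String indexFor(String prefix, long epochMillis) {
        return prefix + "-" + DAY.format(Instant.ofEpochMilli(epochMillis));
    }

    public static void main(String[] args) {
        // Epoch 0 falls on 1970-01-01 UTC.
        System.out.println(indexFor("events", 0L));
    }
}
```

Because the index name is computed from the event's own timestamp, the first event of a new day naturally lands in a fresh index without any explicit switching logic.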

Thanks,
Anuj.
