Hi,

For your point 3. you can look at `FlinkKafkaConsumerBase.setStartFromTimestamp(...)`.
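
Roughly, that could look like the sketch below. This is untested, and the topic name, 
bootstrap servers, group id, and timestamp are just placeholders:

import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class StartFromTimestampExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "my-group");

        FlinkKafkaConsumer<String> consumer =
                new FlinkKafkaConsumer<>("my-topic", new SimpleStringSchema(), props);

        // Start reading from the given point in time (epoch milliseconds).
        // The job keeps consuming from there until the client cancels it.
        consumer.setStartFromTimestamp(1609891200000L);

        env.addSource(consumer).print();
        env.execute("read-from-timestamp");
    }
}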

Points 1. and 2. will not work with the well-established `FlinkKafkaConsumer`. However, it should be possible to do it with the new `KafkaSource` that was introduced in Flink 1.12. It might be a bit rough around the edges, though.

With the `KafkaSource` you can specify an `OffsetsInitializer` for both the starting and the stopping offset of the source. Take a look at `KafkaSource`, `KafkaSourceBuilder`, and `OffsetsInitializer` in the code.
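
For points 1. and 2. you would give the source both a starting and a stopping offset, 
which turns it into a bounded read. A rough, untested sketch (topic, servers, group id, 
and the t1/t2 timestamps are placeholders, and the exact builder method names may 
differ slightly between Flink versions):

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class BoundedKafkaReadExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        long t1 = 1609891200000L; // start of the interval (epoch millis), placeholder
        long t2 = 1609977600000L; // end of the interval (epoch millis), placeholder

        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("my-topic")
                .setGroupId("my-group")
                // start at the offsets whose record timestamps are >= t1
                .setStartingOffsets(OffsetsInitializer.timestamp(t1))
                // stop once the offsets for timestamp t2 are reached (bounded read)
                .setBounded(OffsetsInitializer.timestamp(t2))
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-between-t1-and-t2")
                .print();

        env.execute("bounded-kafka-read");
    }
}

Leaving out `setBounded(...)` gives you an unbounded read starting at t1, which is 
another way of doing your point 3. with the new source.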

I hope this helps.

Best,
Aljoscha

On 2021/01/08 07:51, vinay.raic...@t-systems.com wrote:
Hi Flink Community Team,

This is a desperate request for your help on the points below.

I am new to Flink and am trying to use it with Kafka for event-based data 
stream processing in my project. I am struggling to find solutions with Flink 
for the following project requirements:


 1.  Get all Kafka topic records at a given point in time 't' (now or in the 
past). Also, how to pull only the latest record from Kafka using Flink?
 2.  Get all records from Kafka for a given time interval in the past, between 
times t1 and t2.
 3.  Continuously get data from Kafka starting at a given point in time (now 
or in the past). The client will actively cancel/close the data stream. 
Example: live dashboards. How to do this using Flink?
Please provide me with sample Flink code snippets for pulling data from Kafka for 
the above three requirements. I have been stuck for the last month without much 
progress, and your timely help would be a savior for me!
Thanks & Regards,
Vinay Raichur
T-Systems India<https://www.t-systems.com/in/en> | Digital Solutions
Mail: vinay.raic...@t-systems.com
Mobile: +91 9739488992
