Hi,

I am a newbie to Flink and am facing some challenges solving the use case below.

Use Case description:

Every day I will receive a csv file, with a timestamp in its name, in some
folder, say *input*. The file name format would be
*file_name_dd-mm-yy-hh-mm-ss.csv*.

My Flink pipeline should read this csv file row by row and write each row
to my Kafka topic.

Once the pipeline has read the entire file, the file needs to be moved to
another folder, say *historic*, so that the *input* folder stays empty for
the next file.
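To make the question concrete, here is a rough sketch of what I have in mind, in plain Java without Flink. The Kafka write is only a placeholder (in the real pipeline it would be a Kafka sink/producer), and the names *processAndArchive* and *sendToKafka* are just ones I made up for illustration:

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.List;

public class CsvMoveSketch {
    // Placeholder for the real Kafka write (a Kafka producer/sink in practice).
    static void sendToKafka(String row) {
        System.out.println("would send to Kafka: " + row);
    }

    // Read every row of the csv, then move the file into the historic folder.
    static Path processAndArchive(Path csvFile, Path historicDir) throws IOException {
        List<String> rows = Files.readAllLines(csvFile);
        for (String row : rows) {
            sendToKafka(row);
        }
        Files.createDirectories(historicDir);
        Path target = historicDir.resolve(csvFile.getFileName());
        // Move only after the whole file has been read.
        return Files.move(csvFile, target, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        // Simulate the input folder with a temp directory and one csv file.
        Path input = Files.createTempDirectory("input");
        Path historic = input.resolveSibling(input.getFileName() + "-historic");
        Path csv = input.resolve("file_name_01-01-24-10-00-00.csv");
        Files.write(csv, List.of("a,1", "b,2"));
        Path moved = processAndArchive(csv, historic);
        System.out.println("moved=" + Files.exists(moved)
                + " originalGone=" + !Files.exists(csv));
    }
}
```

What I am unsure about is how to trigger this move step from inside a Flink job, since the source reads the file but the move has to happen only after the read is complete.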

I googled a lot but did not find anything, so could you guide me on how to
achieve this?

Let me know if anything else is required.


Samir Vasani
