Hi Maxim,
you could inject an AssignerWithPunctuatedWatermarks into your pipeline that
emits a watermark for every record it sees.
That way, the logical event time advances with every record.
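As a rough sketch (MyEvent and its getEventTime() accessor are placeholders here,
and the timestamps are assumed to be epoch milliseconds), such an assigner could
look like this:

```java
import org.apache.flink.streaming.api.functions.AssignerWithPunctuatedWatermarks;
import org.apache.flink.streaming.api.watermark.Watermark;

public class PerRecordWatermarkAssigner
        implements AssignerWithPunctuatedWatermarks<MyEvent> {

    @Override
    public long extractTimestamp(MyEvent element, long previousElementTimestamp) {
        // Use the event-time field carried in the record itself.
        return element.getEventTime();
    }

    @Override
    public Watermark checkAndGetCurrentWatermark(MyEvent lastElement, long extractedTimestamp) {
        // Returning a non-null watermark here emits one per record,
        // so event time advances as soon as each record is processed.
        return new Watermark(extractedTimestamp);
    }
}
```

You would then attach it with something like
stream.assignTimestampsAndWatermarks(new PerRecordWatermarkAssigner()).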
Best, Fabian
2017-08-04 16:27 GMT+02:00 Maksym Parkachov:
> Hi,
>
> I'm evaluating Flink as alternative
Hi,
I'm evaluating Flink as an alternative to Spark Streaming for a test project
that reads from Kafka and saves to Cassandra. Everything works, but I'm
struggling with integration tests. I could not figure out how to manually
advance time in Flink. Basically, I write a message to Kafka with the event
time in the