Hi Tim,
The way session windows work is by first creating a new window for every
incoming event and then merging overlapping windows. That's why you see
that the end time of a window increases with every new incoming event. I
hope this explains what you are seeing. Apart from that, I think the
Ses
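
To make the merging behavior concrete, here is a minimal sketch of a session-window pipeline. The key selector, the 10-minute gap, and the Event/Summary/SummaryAggregator types are illustrative assumptions, not taken from this thread:

```java
// Sketch only: Event, Summary, SummaryAggregator and getUserId() are
// hypothetical placeholders; the 10-minute gap is an example value.
DataStream<Summary> summaries = events
    .keyBy(event -> event.getUserId())
    // Each incoming event initially gets its own window of
    // [timestamp, timestamp + gap); overlapping windows are then merged,
    // which is why the window's end time keeps growing while events
    // continue to arrive within the gap.
    .window(EventTimeSessionWindows.withGap(Time.minutes(10)))
    .aggregate(new SummaryAggregator());
```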
Thanks! I've managed to implement a working solution with the trigger API,
but I'm not exactly sure why it works.
I'm doing the following:
DataStream summaries = env
.addSource(kafkaConsumer, "playerEvents(Kafka)")
.name("EP - Read player events from Kafka")
.uid("EP - Read
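
For context, wiring a custom trigger into a pipeline like this typically looks roughly as follows. This is not Tim's actual (truncated) code; the window gap, key selector, trigger class, and aggregator names are all hypothetical:

```java
// Hypothetical continuation sketch; none of these names come from the thread.
DataStream<Summary> summaries = playerEvents
    .keyBy(event -> event.getPlayerId())
    .window(EventTimeSessionWindows.withGap(Time.minutes(10)))
    // A custom Trigger implementation decides when the window emits,
    // e.g. on a marker event or when a timer fires.
    .trigger(new MyEventOrTimeoutTrigger())
    .aggregate(new SummaryAggregator());
```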
If you use the Trigger API, you don't have to do anything special for
fault tolerance. If you use a ProcessFunction, you should store your state
in Flink's state primitives (e.g. ValueState). That state is then
automatically checkpointed. In case of a failure, Flink will always
resume
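
As a sketch of the ProcessFunction route: state registered through a state descriptor is included in checkpoints automatically. The type names (PlayerEvent, Summary) and the one-minute timer interval below are assumptions for illustration, not from the thread:

```java
// Sketch only: PlayerEvent, Summary and the emission interval are hypothetical.
public class SummaryFunction
        extends KeyedProcessFunction<String, PlayerEvent, Summary> {

    private transient ValueState<Summary> summaryState;

    @Override
    public void open(Configuration parameters) {
        // State obtained via a descriptor is checkpointed by Flink.
        summaryState = getRuntimeContext().getState(
                new ValueStateDescriptor<>("summary", Summary.class));
    }

    @Override
    public void processElement(PlayerEvent event, Context ctx,
                               Collector<Summary> out) throws Exception {
        Summary summary = summaryState.value();
        if (summary == null) {
            summary = new Summary();
            // Emit at a regular interval via a timer (interval is illustrative).
            ctx.timerService().registerEventTimeTimer(ctx.timestamp() + 60_000L);
        }
        summary.add(event);
        summaryState.update(summary);
    }

    @Override
    public void onTimer(long timestamp, OnTimerContext ctx,
                        Collector<Summary> out) throws Exception {
        Summary summary = summaryState.value();
        if (summary != null) {
            out.collect(summary);   // emit the current summary when the timer fires
            summaryState.clear();
        }
    }
}
```

After a failure, Flink restores the contents of summaryState from the last completed checkpoint, so the function resumes with the state it had at that point.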
Thanks for the suggestions! I'll see if I can implement something that
works!
A follow-up question, more related to state. If I implement either the
custom trigger or the process function, how will they handle crashes and
the like? If I have, for instance, a checkpointing interval of 10s, will
the
Hi Tim,
I think you could use Flink's trigger API [1] to implement a trigger which
fires when it sees a certain event or after some time.
[1]
https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/operators/windows.html#triggers
Cheers,
Till
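
A hypothetical sketch of such a trigger, firing either when it sees a designated "end" event or when the window's event-time timer comes due. PlayerEvent and its isSessionEnd() predicate are assumed names, not from the thread:

```java
// Sketch only: PlayerEvent and isSessionEnd() are hypothetical.
public class EventOrTimeoutTrigger extends Trigger<PlayerEvent, TimeWindow> {

    @Override
    public TriggerResult onElement(PlayerEvent element, long timestamp,
                                   TimeWindow window, TriggerContext ctx) {
        if (element.isSessionEnd()) {
            // Emit and clear the window as soon as the marker event arrives.
            return TriggerResult.FIRE_AND_PURGE;
        }
        // Otherwise make sure a timer is set for the end of the window.
        ctx.registerEventTimeTimer(window.maxTimestamp());
        return TriggerResult.CONTINUE;
    }

    @Override
    public TriggerResult onEventTime(long time, TimeWindow window,
                                     TriggerContext ctx) {
        // Fire "after some time", i.e. when the window's own timer comes due.
        return time == window.maxTimestamp()
                ? TriggerResult.FIRE : TriggerResult.CONTINUE;
    }

    @Override
    public TriggerResult onProcessingTime(long time, TimeWindow window,
                                          TriggerContext ctx) {
        return TriggerResult.CONTINUE;
    }

    @Override
    public void clear(TimeWindow window, TriggerContext ctx) {
        ctx.deleteEventTimeTimer(window.maxTimestamp());
    }
}
```

Note that if this trigger is used with merging windows such as session windows, it must also override canMerge() and onMerge().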
On Wed, Apr 28, 2021 at 5:25 PM Tim Josefs
Hello!
I'm trying to figure out how to implement a window that will emit events at
regular intervals or when a specific event is encountered.
A bit of background. I have a stream of events from devices that send
events to our system whenever a user watches a video. These events include
a uni