one upstream batch and the next so the windows
> would remain open for that time.
>
>
>
> Thanks,
>
> Jonny
>
> *From:* miki haiat [mailto:miko5...@gmail.com]
> *Sent:* Monday, January 21, 2019 5:07 PM
> *To:* Jonny Graham
> *Cc:* user@flink.apache.org
> *Subject:* Re: Kafka stream fed in batches throughout the day
In Flink you can't read data from Kafka with the DataSet (batch) API, and you don't want to mess with starting and stopping your job every few hours.
Can you elaborate more on your use case? Are you going to use keyBy? Is there any way to use a trigger...?
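To make the keyBy/window/trigger mechanics concrete, here is a minimal sketch in plain Python (no Flink; all names and the window/out-of-orderness sizes are hypothetical) of how event-time tumbling windows behave on a bursty stream: a window keyed by key fires only once the watermark, derived from event timestamps, passes the window's end, so between upstream batches the watermark stalls and open windows simply wait.

```python
from collections import defaultdict

WINDOW_SIZE = 60_000        # hypothetical 1-minute tumbling windows (ms)
OUT_OF_ORDERNESS = 5_000    # hypothetical bounded out-of-orderness (ms)

def window_start(ts):
    """Start of the tumbling window that timestamp ts falls into."""
    return ts - ts % WINDOW_SIZE

def process(events):
    """Consume (key, timestamp, value) events; return fired windows.

    Mimics event-time tumbling windows: a window (key, start) fires
    only when the watermark (max seen timestamp minus the allowed
    out-of-orderness) passes its end. Nothing fires on wall-clock
    time, so a stalled watermark between batches leaves windows open.
    """
    buffers = defaultdict(list)   # (key, window_start) -> buffered values
    watermark = float("-inf")
    fired = []
    for key, ts, value in events:
        buffers[(key, window_start(ts))].append(value)
        watermark = max(watermark, ts - OUT_OF_ORDERNESS)
        # Fire every window whose end the watermark has now passed.
        for k, start in sorted(buffers):
            if start + WINDOW_SIZE <= watermark:
                fired.append((k, start, buffers.pop((k, start))))
    return fired, watermark
```

In real Flink this roughly corresponds to `keyBy(...)` followed by a tumbling event-time window with a bounded-out-of-orderness watermark assigner; the entries still sitting in `buffers` after a batch are the "windows that remain open" until the next batch advances the watermark.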
On Mon, Jan 21, 2019 at 4:43 PM Jonny Graham wrote:
We have a Kafka stream of events that we want to process with a Flink
datastream process. However, the stream is populated by an upstream batch
process that only executes every few hours, so the stream has very 'bursty'
behaviour. We need a window based on event time to await the next events for