Sure:
                           /--> AsyncIO --\
STREAM --> ProcessFunc -->                 --> Union --> WindowFunc
                           \--------------/
ProcessFunc keeps track of the unique keys per window duration and emits
each key only once.
For each key I need to call an external REST service to get the current
status and this is why I'd like to use Async IO. At the moment I do this in
a process function but I'd like a cleaner solution (if possible).
Do you think your proposal of forking could be a better option?
Could you provide a sketch?
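
For reference, here is a minimal sketch of how such a forked topology could be
wired up (Flink 1.8-era DataStream API, untested; Event, Enriched, RestClient
and their factory methods are made-up placeholders, and the key dedup uses
processing-time timers only for brevity):

import java.util.Collections;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.AsyncDataStream;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.streaming.api.functions.async.ResultFuture;
import org.apache.flink.streaming.api.functions.async.RichAsyncFunction;
import org.apache.flink.util.Collector;

public class ForkUnionSketch {

  // ProcessFunc: emits each key at most once per window duration,
  // using a keyed "seen" flag that a timer clears again.
  public static class OncePerWindow extends KeyedProcessFunction<String, Event, String> {
    private final long windowMillis;
    private transient ValueState<Boolean> seen;

    public OncePerWindow(long windowMillis) { this.windowMillis = windowMillis; }

    @Override
    public void open(Configuration parameters) {
      seen = getRuntimeContext().getState(
          new ValueStateDescriptor<>("seen", Boolean.class));
    }

    @Override
    public void processElement(Event e, Context ctx, Collector<String> out) throws Exception {
      if (seen.value() == null) {
        seen.update(true);
        out.collect(ctx.getCurrentKey());
        ctx.timerService().registerProcessingTimeTimer(
            ctx.timerService().currentProcessingTime() + windowMillis);
      }
    }

    @Override
    public void onTimer(long ts, OnTimerContext ctx, Collector<String> out) {
      seen.clear(); // the key may be emitted again for the next window
    }
  }

  // AsyncIO: one REST call per emitted key (RestClient is a placeholder).
  public static class StatusLookup extends RichAsyncFunction<String, Enriched> {
    @Override
    public void asyncInvoke(String key, ResultFuture<Enriched> resultFuture) {
      CompletableFuture.supplyAsync(() -> RestClient.fetchStatus(key))
          .thenAccept(status -> resultFuture.complete(
              Collections.singleton(Enriched.ofStatus(key, status))));
    }
  }

  // Wiring: fork the deduplicated keys to AsyncIO, union the results back
  // with the main stream, and feed the unioned stream into the WindowFunc.
  public static DataStream<Enriched> wire(DataStream<Event> events) {
    DataStream<String> keysOncePerWindow = events
        .keyBy(Event::getKey)
        .process(new OncePerWindow(10 * 60 * 1000L)); // window duration in ms

    DataStream<Enriched> statuses = AsyncDataStream.unorderedWait(
        keysOncePerWindow, new StatusLookup(), 30, TimeUnit.SECONDS, 100);

    return events.map(Enriched::ofEvent).union(statuses);
  }
}

The unioned stream would then be keyed and windowed as before, so the
WindowFunc sees both the original events and the async statuses.
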
OK, I see. What information will be sent out via the async request?
Maybe you can fork off a separate stream with the info that needs to be sent
to the external service and later union the result with the main stream
before the window operator?
On Tue, 23 Jul 2019 at 14:12, Flavio Pompermaier wrote:
The problem of bundling all records together within a window is that this
solution doesn't scale (in the case of large time windows and a large number
of events). My requirement could be fulfilled by a keyed ProcessFunction, but
I think AsyncDataStream should provide first-class support for keyed streams.
Hi Flavio,
Not sure I understood the requirements correctly.
Couldn't you just collect and bundle all records with a regular window
operator and forward one record for each key-window to an AsyncIO operator?
Best, Fabian
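
For illustration, this bundling idea might look roughly like the fragment
below (Flink 1.8-era API, untested; Event and Enriched are placeholders,
BundleStatusLookup is a placeholder AsyncFunction making one REST call per
bundle, and the usual imports plus a DataStream<Event> events are assumed):

// one record per key and window, carrying all bundled elements
DataStream<Tuple2<String, List<Event>>> bundles = events
    .keyBy(Event::getKey)
    .timeWindow(Time.minutes(10))
    .apply(new WindowFunction<Event, Tuple2<String, List<Event>>, String, TimeWindow>() {
      @Override
      public void apply(String key, TimeWindow window, Iterable<Event> input,
                        Collector<Tuple2<String, List<Event>>> out) {
        List<Event> bundle = new ArrayList<>();
        input.forEach(bundle::add);
        out.collect(Tuple2.of(key, bundle));
      }
    });

// a single async call per key-window
DataStream<Enriched> enriched = AsyncDataStream.unorderedWait(
    bundles, new BundleStatusLookup(), 30, TimeUnit.SECONDS, 100);

As noted further up in the thread, this buffers every record of a key-window
in state before the async call, which is the scaling concern with large
windows and many events.
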
On Thu, 18 Jul 2019 at 12:20, Flavio Pompermaier <pomperma... wrote:
Hi to all,
I'm trying to exploit async IO in my Flink job.
In my use case I use keyed tumbling windows and I'd like to execute the
async action only once per key and window (while
AsyncDataStream.unorderedWait executes the async call for every element
of my stream). Is there an easy way to do this?
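
For reference, the per-element pattern mentioned above looks roughly like
this (StatusLookup is a placeholder AsyncFunction that performs the REST
call; events is the input DataStream):

// fires one async request for every element of the stream
DataStream<Enriched> enriched = AsyncDataStream.unorderedWait(
    events,                  // input stream
    new StatusLookup(),      // AsyncFunction doing the REST call
    30, TimeUnit.SECONDS,    // timeout per async request
    100);                    // max in-flight requests

The async request fires once per element, and the resulting stream is no
longer keyed, which is what the rest of the thread tries to avoid.
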