Hello,

I would like to compute statistics on a stream every hour. For that, I need to
compute statistics on the keyed stream first, and then re-aggregate them globally.
I have tried the following:

stream.keyBy(mykey)
      .window(TumblingProcessingTimeWindows.of(Time.hours(1)))
      .aggregate(myPerKeyAggregate)
      .windowAll(TumblingProcessingTimeWindows.of(Time.hours(1))) // a second, separate window: adds one hour of delay...
      .reduce(fullyAggregateIntermediaryResults)
      // ... then sink

This works, but the first result reaches the sink 2 hours after the first item
enters the stream, i.e. 1 hour later than should be possible.
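To make the latency concrete, here is a minimal sketch of the timing arithmetic (plain Python, not Flink), assuming tumbling processing-time windows that fire at their end timestamp; the window size and event time are illustrative:

```python
WINDOW = 3600  # 1-hour tumbling window, in seconds

def window_end(ts, size=WINDOW):
    """End of the tumbling processing-time window containing ts."""
    return (ts // size + 1) * size

# Suppose the first event arrives 30 minutes into the hour.
first_event_ts = 1800

# Step 1: the keyed window fires at the end of its hour.
keyed_fire = window_end(first_event_ts)   # 3600

# Step 2: windowAll only sees that aggregate at keyed_fire, so the
# record lands in the *next* window, which fires one hour later.
final_fire = window_end(keyed_fire)       # 7200

print(final_fire - first_event_ts)        # 5400 s of end-to-end latency
```

The second window contributes a full extra hour on top of the unavoidable wait for the first window to close, which matches the delay described above.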

My question: how do I trigger the reduce step immediately after the first
aggregation?

Best regards,
Arnaud



________________________________

The integrity of this message cannot be guaranteed on the Internet. The company 
that sent this message cannot therefore be held liable for its content nor 
attachments. Any unauthorized use or dissemination is prohibited. If you are 
not the intended recipient of this message, then please delete it and notify 
the sender.
