RE: Release 1.2?

2017-01-23 Thread denis.dollfus
OK, thanks for the link. This makes sense. I don’t know why I had in mind that running Flink on Windows implied using the hadoop1 flavor of Flink – that’s very likely unrelated, please correct me if I’m wrong. Regards, Denis From: Greg Hogan [mailto:c...@greghogan.com] Sent: Monday, January 23

RE: Release 1.2?

2017-01-23 Thread denis.dollfus
I notice that for Flink 1.2.0 there is no x.y.z-hadoop1 folder (cf. the Apache staging repo). That flavor is very useful to me for developing on Windows. Is it temporary, for the 1.2 RCs only? Regards, Denis

RE: Release 1.2?

2017-01-17 Thread denis.dollfus
Thanks for the quick update, sounds perfect! And I'll try the staging repo trick in pom.xml. Denis From: Stephan Ewen [mailto:se...@apache.org] Sent: Tuesday, January 17, 2017 11:42 To: user@flink.apache.org Subject: Re: Release 1.2? So far it looks as if the next release candidate comes in a few days
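
(The "staging repo trick" is just a matter of declaring the Apache staging repository in the project's pom.xml so that RC artifacts can be resolved before they reach Maven Central. A minimal sketch is below; the repository id and URL are placeholders, since the exact staging repository for each RC is announced in the vote thread.)

    <!-- Sketch only: resolve Flink RC artifacts from the Apache staging repository. -->
    <!-- The URL below is a placeholder; use the one announced for the RC under test. -->
    <repositories>
      <repository>
        <id>apache-staging</id>
        <url>https://repository.apache.org/content/repositories/orgapacheflink-XXXX/</url>
        <releases><enabled>true</enabled></releases>
        <snapshots><enabled>false</enabled></snapshots>
      </repository>
    </repositories>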

Release 1.2?

2017-01-17 Thread denis.dollfus
Hi all, Do you have a ballpark estimate for a stable release of Flink 1.2? We are still at a proof-of-concept stage and are interested in several features of 1.2, notably async stream operations (FLINK-4391). Thank you, Denis
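
(For context, FLINK-4391 adds an async I/O operator to the DataStream API. The sketch below shows roughly how it is used with the 1.2-era class names; the collector interface was renamed in later releases, and the lookup here is simulated with a CompletableFuture, so treat it as an illustration rather than a reference.)

    // Rough sketch of the async stream operations from FLINK-4391 (Flink 1.2 API).
    // The lookup is simulated with a CompletableFuture; a real job would call a
    // non-blocking client (database, REST, ...) here.
    import java.util.Collections;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.TimeUnit;

    import org.apache.flink.streaming.api.datastream.AsyncDataStream;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.functions.async.AsyncFunction;
    import org.apache.flink.streaming.api.functions.async.collector.AsyncCollector;

    public class AsyncEnrichment {

        public static class EnrichFunction implements AsyncFunction<String, String> {
            @Override
            public void asyncInvoke(String key, AsyncCollector<String> collector) {
                CompletableFuture
                        .supplyAsync(() -> "value-for-" + key)   // stand-in for an async call
                        .thenAccept(v -> collector.collect(Collections.singletonList(v)));
            }
        }

        public static DataStream<String> enrich(DataStream<String> input) {
            // Unordered results, 1 s timeout per request, at most 100 requests in flight.
            return AsyncDataStream.unorderedWait(input, new EnrichFunction(), 1000, TimeUnit.MILLISECONDS, 100);
        }
    }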

RE: Equivalent of Rx combineLatest() on a join?

2016-12-13 Thread denis.dollfus
Thanks Gábor, indeed it appears to work as expected. I found another way based on the new evictors included in Flink 1.2 (see FLINK-4174), which can remove elements anywhere in a window, for example based on element content. However, the CoFlatMap solution you suggest is definitely simpler, I'm going
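
(For context, the CoFlatMap solution mentioned here is, roughly, the standard combineLatest pattern: connect the two keyed streams, keep the latest value from each side in keyed state, and emit a combined element whenever either side updates. A minimal sketch, assuming Tuple2<String, Double> elements keyed by their first field:)

    // combineLatest-style CoFlatMap: keep the latest element from each input in
    // keyed state and emit a pair whenever either side receives an update.
    import org.apache.flink.api.common.state.ValueState;
    import org.apache.flink.api.common.state.ValueStateDescriptor;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.functions.co.RichCoFlatMapFunction;
    import org.apache.flink.util.Collector;

    public class CombineLatest
            extends RichCoFlatMapFunction<Tuple2<String, Double>, Tuple2<String, Double>, Tuple2<Double, Double>> {

        private transient ValueState<Double> latestLeft;
        private transient ValueState<Double> latestRight;

        @Override
        public void open(Configuration parameters) {
            latestLeft = getRuntimeContext().getState(new ValueStateDescriptor<>("latestLeft", Double.class));
            latestRight = getRuntimeContext().getState(new ValueStateDescriptor<>("latestRight", Double.class));
        }

        @Override
        public void flatMap1(Tuple2<String, Double> value, Collector<Tuple2<Double, Double>> out) throws Exception {
            latestLeft.update(value.f1);
            emitIfComplete(out);
        }

        @Override
        public void flatMap2(Tuple2<String, Double> value, Collector<Tuple2<Double, Double>> out) throws Exception {
            latestRight.update(value.f1);
            emitIfComplete(out);
        }

        // Emit a combined pair only once both sides have produced at least one value.
        private void emitIfComplete(Collector<Tuple2<Double, Double>> out) throws Exception {
            Double left = latestLeft.value();
            Double right = latestRight.value();
            if (left != null && right != null) {
                out.collect(new Tuple2<>(left, right));
            }
        }

        // Wiring, keyed by the tuple's first field so the state above is per key:
        //   DataStream<Tuple2<Double, Double>> combined =
        //       s1.keyBy(0).connect(s2.keyBy(0)).flatMap(new CombineLatest());
    }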

RE: Equivalent of Rx combineLatest() on a join?

2016-12-05 Thread denis.dollfus
Actually that doesn't work as expected because emitted values are not purged. I'll experiment with purging triggers and/or evictors, though I have the feeling that Flink was not designed for what we need to do here -- but I'll keep on searching. In the meantime, any advice is appreciated. If the
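
(The "purging triggers and/or evictors" idea boils down to replacing the global window's default trigger and trimming the window contents so that only the most recent element survives. A minimal sketch on a single keyed stream, assuming Tuple2<String, Double> elements keyed by their first field:)

    // Sketch: keep only the most recent element per key by firing on every new
    // element and evicting everything but the newest one. The element type and
    // key position are assumptions.
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.windowing.assigners.GlobalWindows;
    import org.apache.flink.streaming.api.windowing.evictors.CountEvictor;
    import org.apache.flink.streaming.api.windowing.triggers.CountTrigger;

    public class LatestPerKey {
        public static DataStream<Tuple2<String, Double>> latest(DataStream<Tuple2<String, Double>> input) {
            return input
                    .keyBy(0)
                    .window(GlobalWindows.create())
                    .trigger(CountTrigger.of(1))     // fire whenever a new element arrives
                    .evictor(CountEvictor.of(1))     // keep only the newest element in the window
                    // Alternative: .trigger(PurgingTrigger.of(CountTrigger.of(1))) clears the
                    // whole window after each firing instead of evicting.
                    .reduce((a, b) -> b);            // the single surviving element is emitted as-is
        }
    }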

RE: Equivalent of Rx combineLatest() on a join?

2016-12-05 Thread denis.dollfus
Asking the question helped me to find the answer (yes, rubber duck debugging), as it seems that the code below does what I need: s3 = s1.join(s2) .where(new KeySelector1()).equalTo(new KeySelector2()) .window(GlobalWindows.create
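
(The code is cut off by the archive. A plausible completion of the call chain is sketched below; KeySelector1/KeySelector2 and the Tuple2 element types are stand-ins, and, as the follow-up messages above note, a global window without a purging trigger or evictor keeps every past element around, which is why the thread eventually settles on the CoFlatMap approach.)

    // Sketch of the truncated snippet: keyed join over a global window that fires
    // on every new element. Note that nothing is purged here, so the window keeps
    // growing -- the issue raised in the follow-up message.
    import org.apache.flink.api.common.functions.JoinFunction;
    import org.apache.flink.api.java.functions.KeySelector;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.windowing.assigners.GlobalWindows;
    import org.apache.flink.streaming.api.windowing.triggers.CountTrigger;

    public class GlobalWindowJoin {

        // Placeholder key selectors; the original KeySelector1/KeySelector2 are not shown in the archive.
        public static class KeySelector1 implements KeySelector<Tuple2<String, Double>, String> {
            @Override
            public String getKey(Tuple2<String, Double> value) { return value.f0; }
        }

        public static class KeySelector2 implements KeySelector<Tuple2<String, Double>, String> {
            @Override
            public String getKey(Tuple2<String, Double> value) { return value.f0; }
        }

        public static DataStream<Tuple2<Double, Double>> join(
                DataStream<Tuple2<String, Double>> s1,
                DataStream<Tuple2<String, Double>> s2) {

            return s1.join(s2)
                    .where(new KeySelector1())
                    .equalTo(new KeySelector2())
                    .window(GlobalWindows.create())
                    .trigger(CountTrigger.of(1))   // fire on every element; nothing is purged
                    .apply(new JoinFunction<Tuple2<String, Double>, Tuple2<String, Double>, Tuple2<Double, Double>>() {
                        @Override
                        public Tuple2<Double, Double> join(Tuple2<String, Double> left, Tuple2<String, Double> right) {
                            return new Tuple2<>(left.f1, right.f1);
                        }
                    });
        }
    }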

Equivalent of Rx combineLatest() on a join?

2016-12-05 Thread denis.dollfus
Hi all, [first email here, I'm new to Flink, Java and Scala, sorry if I missed something obvious] I'm exploring Flink in the context of streaming calculators. Basically, the data flow boils down to multiple data streams with variable update rates (ms, seconds, ..., month) which are joined befo