Re: Regarding dividing the streams using keyby

2016-11-20 Thread Fabian Hueske
Hi, the result of a window operation on a KeyedStream is a regular DataStream. So, you would need to call keyBy() on the result again if you'd like to have a KeyedStream. You can also key a stream by two or more attributes: DataStream<…> windowedStream = jsonToTuple.keyBy(0, 1, 3)
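Flink's keyBy(0, 1, 3) keys the stream by the combination of those tuple positions. As a plain-Java sketch of that composite-key grouping semantics (hypothetical data and names, not the Flink API):

```java
import java.util.*;
import java.util.stream.*;

public class CompositeKeyDemo {
    // Group rows by the combination of fields 0, 1 and 3,
    // mirroring the semantics of keyBy(0, 1, 3) on a tuple stream.
    static Map<List<Object>, List<Object[]>> groupByFields013(List<Object[]> rows) {
        return rows.stream().collect(Collectors.groupingBy(
                r -> List.of(r[0], r[1], r[3])));
    }

    public static void main(String[] args) {
        List<Object[]> rows = List.of(
                new Object[]{"a", 1, "x", true},
                new Object[]{"a", 1, "y", true},   // same composite key as the first row
                new Object[]{"b", 2, "z", false});
        // Two distinct composite keys: ("a", 1, true) and ("b", 2, false)
        System.out.println(groupByFields013(rows).size());
    }
}
```

In Flink the analogous grouping is done by hash-partitioning on the composite key, so all elements with the same key combination go to the same parallel instance.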

IterativeStream seems to ignore maxWaitTimeMillis

2016-11-20 Thread Juan Rodríguez Hortalá
Hi, I wrote a proof of concept for a Java version of mapWithState with time-based state eviction https://github.com/juanrh/flink-state-eviction/blob/a6bb0d4ca0908d2f4350209a4a41e381e99c76c5/src/main/java/com/github/juanrh/streaming/MapWithStateIterPoC.java. The idea is: - Convert an input KeyedS
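The linked proof of concept builds on Flink's IterativeStream; as a minimal plain-Java sketch of the underlying idea, time-based eviction of per-key state, with hypothetical names and an injected clock rather than Flink's API:

```java
import java.util.*;

public class StateEvictionSketch {
    static final long TTL_MILLIS = 60_000;

    // Per-key state together with the timestamp of its last update.
    record Entry(int value, long lastUpdated) {}

    final Map<String, Entry> state = new HashMap<>();

    // Update state for a key, stamping the (injected) current time.
    void update(String key, int value, long nowMillis) {
        state.put(key, new Entry(value, nowMillis));
    }

    // Drop entries that have not been touched within the TTL window.
    void evictOlderThan(long nowMillis) {
        state.values().removeIf(e -> nowMillis - e.lastUpdated > TTL_MILLIS);
    }

    public static void main(String[] args) {
        StateEvictionSketch s = new StateEvictionSketch();
        s.update("a", 1, 0);
        s.update("b", 2, 50_000);
        s.evictOlderThan(100_000);            // "a" is 100s old -> evicted
        System.out.println(s.state.keySet()); // only "b" remains
    }
}
```

In the PoC this periodic eviction step is driven through the iteration's feedback edge rather than an explicit method call.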

Re: Failure to download flink-contrib dependency

2016-11-20 Thread Juan Rodríguez Hortalá
Hi Andrey, You are totally right, it worked with the following dependency: org.apache.flink : flink-streaming-contrib_2.10 : ${flink.version}. My knowledge of Maven is quite limited, thanks a lot for your help! Greetings, Juan On Sun, Nov 20, 2016 at 12:58 PM, Andrey Melentyev < andrey
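Reformatted as a POM fragment, the dependency that worked (as quoted in the message; the _2.10 suffix is the Scala version, and flink.version is the 1.1.3 property defined earlier in the thread's POM):

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-contrib_2.10</artifactId>
    <version>${flink.version}</version>
</dependency>
```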

Regarding dividing the streams using keyby

2016-11-20 Thread Abdul Salam Shaikh
I am trying to create a keyed stream which collects events in a window using: DataStream<…> windowedStream = jsonToTuple.keyBy(0) .window(GlobalWindows.create()) .trigger(new WindowCustomTrigger())
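GlobalWindows.create() never fires on its own, so the custom trigger decides when the window is evaluated. A minimal plain-Java sketch of count-based FIRE_AND_PURGE trigger semantics (hypothetical names, not the Flink Trigger API):

```java
import java.util.*;

public class CountTriggerSketch {
    // Buffer elements per key and "fire" (emit the buffered window) once a
    // key has collected `threshold` elements -- the role a custom Trigger
    // plays for a GlobalWindow, which otherwise never fires.
    static List<List<Integer>> process(List<Map.Entry<String, Integer>> events, int threshold) {
        Map<String, List<Integer>> buffers = new HashMap<>();
        List<List<Integer>> fired = new ArrayList<>();
        for (var e : events) {
            List<Integer> buf = buffers.computeIfAbsent(e.getKey(), k -> new ArrayList<>());
            buf.add(e.getValue());
            if (buf.size() >= threshold) {    // trigger condition met
                fired.add(List.copyOf(buf));  // FIRE_AND_PURGE semantics
                buf.clear();
            }
        }
        return fired;
    }

    public static void main(String[] args) {
        var events = List.of(Map.entry("a", 1), Map.entry("a", 2),
                             Map.entry("b", 3), Map.entry("a", 4), Map.entry("a", 5));
        System.out.println(process(events, 2)); // [[1, 2], [4, 5]]
    }
}
```

In Flink the equivalent decisions are returned as TriggerResult values from the trigger's onElement/onEventTime callbacks rather than computed inline.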

Re: Many operations cause StackOverflowError with AWS EMR YARN cluster

2016-11-20 Thread Geoffrey Mon
Hello, I know that the reuse of the data set in my plan is causing the problem (after one dictionary atom is learned using the data set "S", "S" is updated for use with the next dictionary atom). When I comment out the line updating the data set "S", I have no problem and the plan processing phase

Re: Failure to download flink-contrib dependency

2016-11-20 Thread Andrey Melentyev
Hi Juan, flink-contrib itself is just a parent project for a number of others like flink-connector-wikiedits, flink-statebackend-rocksdb, etc. Therefore flink-contrib has its packaging set to "pom" and no jar is published. Depending on which part of flink-contrib you want, you should add a more s
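A concrete POM fragment along the lines Andrey suggests, using flink-statebackend-rocksdb (one of the submodules named in the message) as an example; the _2.10 Scala suffix is an assumption matching the dependency that later worked in this thread:

```xml
<!-- flink-contrib has packaging "pom", so depend on a concrete submodule instead -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-statebackend-rocksdb_2.10</artifactId>
    <version>${flink.version}</version>
</dependency>
```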

Failure to download flink-contrib dependency

2016-11-20 Thread Juan Rodríguez Hortalá
Hi, I'm having problems downloading flink-contrib in my Java Maven project. The relevant part of the pom declares UTF-8 encoding, a flink.version property of 1.1.3, and a dependency on org.apache.flink : flink-contrib : ${flink.version}. I see that in https://repo1.maven.org/maven2/org/apache/flink/flink-contrib/1.1.3/ there are no jar files,