Re: flink 1.1.2 RichFunction not working

2016-09-27 Thread Chen Bekor
…is an open pull request. Once that is in, we can look into getting Flink 1.1.3 out. Best, Stephan. On Tue, Sep 27, 2016 at 8:17 PM, sunny patel wrote: "Hi Chen, please upload your Flink Scala library dependencies."

flink 1.1.2 RichFunction not working

2016-09-27 Thread Chen Bekor
Hi, since upgrading to Flink 1.1.2 from 1.0.2 I'm experiencing a regression in my code. In order to isolate the issue I have written a small Flink job that demonstrates it. The job does some time-based window operations with an input CSV file (in the example below, count the number of events o
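
For reference, a minimal sketch of the kind of repro job described above (the file path, field layout and window size are assumptions, not the original code): read a CSV, key by one field, and count events on a tumbling time window.

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

object WindowCountRepro {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // hypothetical input: lines of "timestamp,key"
    env.readTextFile("/tmp/events.csv")
      .map { line =>
        val fields = line.split(",")
        (fields(1), 1L)                 // emit (key, 1) for counting
      }
      .keyBy(0)
      .timeWindow(Time.minutes(1))      // count events per key per minute
      .sum(1)
      .print()

    env.execute("window count repro")
  }
}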

[no subject]

2016-07-17 Thread Chen Bekor
Hi, I need some assistance. I'm trying to globally register arguments from my main function so they can be retrieved later on the stream processing nodes. My code base is Scala: val env = StreamExecutionEnvironment.getExecutionEnvironment val parameterTool = ParameterTool.fromArgs(args) env.getConfig.set
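
A small sketch of the usual pattern, assuming the truncated call above is env.getConfig.setGlobalJobParameters; the "suffix" argument and the sample stream are made up for illustration.

import org.apache.flink.api.common.functions.RichMapFunction
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.scala._

object GlobalParamsExample {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val parameterTool = ParameterTool.fromArgs(args)

    // register the arguments globally so every operator can read them later
    env.getConfig.setGlobalJobParameters(parameterTool)

    env.fromElements("a", "b", "c")
      .map(new RichMapFunction[String, String] {
        override def map(value: String): String = {
          // retrieve the globally registered parameters on the processing node
          val params = getRuntimeContext.getExecutionConfig
            .getGlobalJobParameters.asInstanceOf[ParameterTool]
          value + "-" + params.get("suffix", "default")
        }
      })
      .print()

    env.execute("global parameters example")
  }
}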

Re: design question

2016-04-24 Thread Chen Bekor
…of the number of GUIDs that are eventually tracked. Whether flink/stream processing is the most effective way to achieve your goal, I can't say, but I am fairly confident that this particular aspect is not a problem. On Sat, Apr 23, 2016 at 1:13 AM, Chen Bekor wrote:

design question

2016-04-23 Thread Chen Bekor
Hi all, I have a stream of incoming object versions (objects change over time) and a requirement to fetch the last known object version from a datastore in order to link it with the id of the new version, so that I end up with a linked list of object versions. All object versions contain the sam
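
One way to avoid the round-trip to the datastore, assuming the previous version only needs to be looked up per object id, is to keep the last seen version id in Flink keyed state. A rough sketch (ObjectVersion and its fields are hypothetical, not the poster's model):

import org.apache.flink.streaming.api.scala._

// hypothetical model: each element carries its object id and version id
case class ObjectVersion(objectId: String, versionId: String, previousVersionId: Option[String])

object VersionLinking {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val versions: DataStream[ObjectVersion] = env.fromElements(
      ObjectVersion("obj-1", "v1", None),
      ObjectVersion("obj-1", "v2", None),
      ObjectVersion("obj-2", "v1", None)
    )

    versions
      .keyBy(_.objectId)
      .mapWithState[ObjectVersion, String] { (current, lastVersionId) =>
        // link the current version to the last one seen for this objectId,
        // then remember the current version id for the next element
        (current.copy(previousVersionId = lastVersionId), Some(current.versionId))
      }
      .print()

    env.execute("version linking sketch")
  }
}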

throttled stream

2016-04-16 Thread Chen Bekor
Is there a way to consume a Kafka stream using Flink with a predefined rate limit (e.g. 5 events per second)? We need this because we have to respect some 3rd-party API rate limitations, so even if we have a much larger throughput potential, we must control the consumption rate in order not to over
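
One possible workaround, sketched below under the assumption that Guava is available on the classpath: throttle each parallel subtask with a blocking RateLimiter so that backpressure naturally slows the Kafka consumer down. This is not a built-in Flink feature, and the rate applies per subtask, so divide the global limit by the operator parallelism.

import com.google.common.util.concurrent.RateLimiter
import org.apache.flink.api.common.functions.RichMapFunction
import org.apache.flink.configuration.Configuration

// passes elements through unchanged, but never faster than permitsPerSecond
class ThrottlingMapper[T](permitsPerSecond: Double) extends RichMapFunction[T, T] {
  @transient private var limiter: RateLimiter = _

  override def open(parameters: Configuration): Unit = {
    limiter = RateLimiter.create(permitsPerSecond)
  }

  override def map(value: T): T = {
    limiter.acquire()   // blocks until a permit is available
    value
  }
}

// usage (kafkaStream is assumed to be a DataStream[String] built from a Kafka source):
// kafkaStream.map(new ThrottlingMapper[String](5.0)).map(callThirdPartyApi _)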

printing datastream to the log - instead of stdout

2016-04-14 Thread Chen Bekor
Hi, the .print() method will print a DataSet / DataStream to stdout. How can I print the stream to the standard logger (logback/log4j) instead? I'm using Flink Scala, so Scala example code is much appreciated. P.S. I noticed that there's a PrintFunction that I can implement, but it feels like I'm
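
A minimal sketch of a logging sink, assuming SLF4J (backed by logback or log4j) is on the classpath; this is a drop-in replacement for .print(), not an official Flink utility.

import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction
import org.apache.flink.streaming.api.scala._
import org.slf4j.{Logger, LoggerFactory}

class LogSink[T](prefix: String) extends RichSinkFunction[T] {
  @transient private var log: Logger = _

  override def open(parameters: Configuration): Unit = {
    log = LoggerFactory.getLogger(classOf[LogSink[_]])
  }

  override def invoke(value: T): Unit = {
    log.info(s"$prefix: $value")   // goes to the configured logger instead of stdout
  }
}

object LogSinkExample {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.fromElements(1, 2, 3)
      .addSink(new LogSink[Int]("numbers"))   // instead of .print()
    env.execute("log sink example")
  }
}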

flink streaming - window chaining example

2016-03-27 Thread Chen Bekor
Hi all! I'm just getting started with Flink and I have a design question. I'm trying to aggregate incoming events (source: a Kafka topic) on a 10-minute tumbling window in order to calculate the incoming event rate (total per minute). I would like to take this window and perform an additional window
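
A rough sketch of one way to chain the windows (the event shape and key are assumptions): count per key on a 1-minute tumbling window first, then aggregate those per-minute counts again over a 10-minute window.

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

object WindowChaining {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // stand-in for the Kafka source: one (key, 1) tuple per incoming event
    val events: DataStream[(String, Long)] =
      env.fromElements(("user-1", 1L), ("user-2", 1L), ("user-1", 1L))

    val perMinuteCounts = events
      .keyBy(0)
      .timeWindow(Time.minutes(1))
      .sum(1)                              // (key, events per minute)

    perMinuteCounts
      .timeWindowAll(Time.minutes(10))     // second window over the first window's output
      .sum(1)                              // total events over 10 minutes
      .print()

    env.execute("window chaining sketch")
  }
}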