Hi Fabian,
Can you please answer the last set of questions I posted on the forum?
Thanks.
On Friday, August 4, 2017, Fabian Hueske-2 [via Apache Flink User Mailing
List archive.] wrote:
> TimeWindow.getStart() or TimeWindow.getEnd()
>
> -> https://ci.apache.org/projects/flink/flink-docs-release-1.3/dev/windows.html#incremental-window-aggregation-with-reducefunction
Thanks Fabian. I do have one more question.
When we connect the two streams and apply the CoProcessFunction, there are
two separate methods, one for each stream. Which stream's state do we need
to store, and will the CoProcessFunction trigger automatically once the
other stream's data arrives, or should we set some timer?
TimeWindow.getStart() or TimeWindow.getEnd()
->
https://ci.apache.org/projects/flink/flink-docs-release-1.3/dev/windows.html#incremental-window-aggregation-with-reducefunction
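In case it helps, a minimal sketch of how the window timestamp can be
attached in the incremental-aggregation pattern from that link. The
(key, 1L) input tuples and the (windowEnd, key, count) output layout are
just assumptions for illustration; getEnd() is used so that the 15-minute
count and the 6-hour stats window that close at the same instant carry the
same timestamp, but getStart() is read the same way:

import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.java.tuple.Tuple;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.windowing.WindowFunction;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

// requests already mapped to (key, 1L) pairs; counts become (windowEnd, key, count)
DataStream<Tuple2<String, Long>> requests = ...;  // placeholder source

DataStream<Tuple3<Long, String, Long>> counts = requests
    .keyBy(0)
    .timeWindow(Time.minutes(15))
    .reduce(
        // incremental aggregation: only the running sum is kept as window state
        new ReduceFunction<Tuple2<String, Long>>() {
            @Override
            public Tuple2<String, Long> reduce(Tuple2<String, Long> a, Tuple2<String, Long> b) {
                return Tuple2.of(a.f0, a.f1 + b.f1);
            }
        },
        // the WindowFunction only sees the single pre-aggregated record plus the window
        new WindowFunction<Tuple2<String, Long>, Tuple3<Long, String, Long>, Tuple, TimeWindow>() {
            @Override
            public void apply(Tuple key, TimeWindow window,
                              Iterable<Tuple2<String, Long>> aggregated,
                              Collector<Tuple3<Long, String, Long>> out) {
                Tuple2<String, Long> count = aggregated.iterator().next();
                // attach the window timestamp to the emitted record
                out.collect(Tuple3.of(window.getEnd(), count.f0, count.f1));
            }
        });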
2017-08-04 22:43 GMT+02:00 Raj Kumar :
> Thanks Fabian.
>
> The incoming events have the timestamps. Once I aggregate in the first
> stream to get the counts and calculate the mean/standard deviation in the
> second, should the new timestamps be the window start time? How do I
> tackle this issue?
Thanks Fabian.
The incoming events have the timestamps. Once I aggregate in the first
stream to get the counts and calculate the mean/standard deviation in the
second, should the new timestamps be the window start time? How do I tackle
this issue?
Hi Raj,
you have to combine two streams. The first stream has the running avg +
std-dev over the last 6 hours, the second stream has the 15 minute counts.
Both streams emit one record every 15 minutes. What you want to do is to
join the two records of both streams with the same timestamp.
You do this by connecting the two streams and joining the matching records
in a stateful CoProcessFunction.
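To give an idea, here is a minimal sketch of such a join. All class, field
and type names are assumptions for illustration (counts as
(windowEnd, key, count) and stats as (windowEnd, key, mean, stdDev) tuples;
the threshold formula mean + 3 * stdDev is also just an example). Each side
buffers its latest record in a ValueState, and a result is only emitted once
the records with the same timestamp have arrived from both streams, i.e.
nothing fires automatically without input; a timer would only be needed to
clean up state if one side never shows up.

import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.api.java.tuple.Tuple4;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.co.CoProcessFunction;
import org.apache.flink.util.Collector;

// Joins the (windowEnd, key, count) record with the (windowEnd, key, mean, stdDev)
// record that carries the same timestamp and emits (windowEnd, key, count, threshold).
public class ThresholdJoin extends CoProcessFunction<
        Tuple3<Long, String, Long>,
        Tuple4<Long, String, Double, Double>,
        Tuple4<Long, String, Long, Double>> {

    private transient ValueState<Tuple3<Long, String, Long>> lastCount;
    private transient ValueState<Tuple4<Long, String, Double, Double>> lastStats;

    @Override
    public void open(Configuration parameters) {
        lastCount = getRuntimeContext().getState(new ValueStateDescriptor<>(
                "lastCount", TypeInformation.of(new TypeHint<Tuple3<Long, String, Long>>() {})));
        lastStats = getRuntimeContext().getState(new ValueStateDescriptor<>(
                "lastStats", TypeInformation.of(new TypeHint<Tuple4<Long, String, Double, Double>>() {})));
    }

    @Override
    public void processElement1(Tuple3<Long, String, Long> count, Context ctx,
                                Collector<Tuple4<Long, String, Long, Double>> out) throws Exception {
        Tuple4<Long, String, Double, Double> stats = lastStats.value();
        if (stats != null && stats.f0.equals(count.f0)) {
            // the stats record with the same timestamp already arrived: emit and clean up
            double threshold = stats.f2 + 3 * stats.f3;  // e.g. mean + 3 * std-dev (assumption)
            out.collect(Tuple4.of(count.f0, count.f1, count.f2, threshold));
            lastStats.clear();
        } else {
            // remember the count until the matching stats record arrives
            lastCount.update(count);
        }
    }

    @Override
    public void processElement2(Tuple4<Long, String, Double, Double> stats, Context ctx,
                                Collector<Tuple4<Long, String, Long, Double>> out) throws Exception {
        Tuple3<Long, String, Long> count = lastCount.value();
        if (count != null && count.f0.equals(stats.f0)) {
            double threshold = stats.f2 + 3 * stats.f3;
            out.collect(Tuple4.of(stats.f0, stats.f1, count.f2, threshold));
            lastCount.clear();
        } else {
            lastStats.update(stats);
        }
    }
}

The function would then run on the connected, keyed streams, e.g.
counts.keyBy(1).connect(stats.keyBy(1)).process(new ThresholdJoin()).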
Thanks Fabian. Your suggestion helped. But I am stuck at the 3rd step.
1. I didn't completely understand step 3. What should the process function
look like? Why does it need to be stateful? Can you please provide more
details on this.
2. In the stateful function, we need to have a value state?
The average would be computed over the aggregated 15-minute count values.
The sliding window would emit every 15 minutes the average of all records
that arrived within the last 6 hours.
Since the preceding 15-minute tumbling window emits 1 record every 15 mins,
this would be the avg over 24 records.
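As a sketch, with the (windowEnd, key, count) records from the earlier
snippet and a placeholder class name, that second step could be declared
like this (the WindowFunction itself is sketched a bit further down):

// import org.apache.flink.api.java.tuple.Tuple4; (other imports as in the earlier snippet)

// 6h window sliding by 15 min over the 15-minute counts: each firing sees
// 6h / 15min = 24 count records per key and emits one stats record
// (windowEnd, key, mean, stdDev).
DataStream<Tuple4<Long, String, Double, Double>> stats = counts
    .keyBy(1)  // field 1 holds the key of the count records
    .timeWindow(Time.hours(6), Time.minutes(15))
    .apply(new AvgStdDevWindowFunction());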
Thanks Fabian. That helps.
I have one more question. In the second step, since I am using the window
function apply(), will the calculated average be a running average or will
it be computed at the end of the 6 hour window?
You can compute the average and std-dev in a WindowFunction that iterates
over all records in the window (6h / 15min = 24).
WindowFunction [1] and CoProcessFunction [2] are described in the docs.
[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/windows.html#windowfunction---the-generic-case
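Roughly, such a WindowFunction could look like the following sketch. The
tuple layouts are the same assumptions as in the snippets above: input
(windowEnd, key, count), output (windowEnd, key, mean, stdDev).

import org.apache.flink.api.java.tuple.Tuple;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.api.java.tuple.Tuple4;
import org.apache.flink.streaming.api.functions.windowing.WindowFunction;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

// Iterates over all count records in the 6h window (24 of them when a count
// arrives every 15 minutes) and emits one (windowEnd, key, mean, stdDev) record.
public class AvgStdDevWindowFunction implements
        WindowFunction<Tuple3<Long, String, Long>, Tuple4<Long, String, Double, Double>, Tuple, TimeWindow> {

    @Override
    public void apply(Tuple key, TimeWindow window,
                      Iterable<Tuple3<Long, String, Long>> counts,
                      Collector<Tuple4<Long, String, Double, Double>> out) {
        long n = 0;
        double sum = 0;
        double sumSquares = 0;
        String recordKey = null;
        for (Tuple3<Long, String, Long> c : counts) {
            n++;
            sum += c.f2;
            sumSquares += (double) c.f2 * c.f2;
            recordKey = c.f1;
        }
        double mean = sum / n;
        // population std-dev; use n - 1 in the denominator for the sample version
        double stdDev = Math.sqrt(Math.max(0.0, sumSquares / n - mean * mean));
        // stamp the result with the window end so it matches the timestamp of the
        // 15-minute count record that closes at the same instant
        out.collect(Tuple4.of(window.getEnd(), recordKey, mean, stdDev));
    }
}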
Thanks Fabian.
Can you provide more details about the implementation of step 2 and step 3?
How to calculate the average and standard deviation?
How does the CoProcessFunction work? Can you provide details about these
two?
Hi,
I would first compute the 15 minute counts. Based on these counts, you
compute the threshold by computing the average and std-dev, and then you
compare the counts with the threshold.
In pseudo code this could look as follows:
DataStream requests = ...
DataStream counts = requests.keyBy(...).timeWindow(Time.minutes(15)).apply(...)
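Completing that pseudo code along the lines discussed further up in this
thread, the whole pipeline could be wired roughly as follows. This is a
sketch only; SumReducer, AttachWindowEnd, AvgStdDevWindowFunction and
ThresholdJoin are placeholder names for the functions sketched in the
earlier snippets, and the tuple layouts are the same assumptions.

// 1) 15-minute counts, stamped with the window timestamp: (windowEnd, key, count)
DataStream<Tuple3<Long, String, Long>> counts = requests
    .keyBy(0)
    .timeWindow(Time.minutes(15))
    .reduce(new SumReducer(), new AttachWindowEnd());

// 2) mean / std-dev over the last 6 hours, emitted every 15 minutes per key
DataStream<Tuple4<Long, String, Double, Double>> stats = counts
    .keyBy(1)
    .timeWindow(Time.hours(6), Time.minutes(15))
    .apply(new AvgStdDevWindowFunction());

// 3) join the count and the stats record that share the same timestamp
DataStream<Tuple4<Long, String, Long, Double>> joined = counts
    .keyBy(1)
    .connect(stats.keyBy(1))
    .process(new ThresholdJoin());

// 4) keep only the counts that exceed the threshold
DataStream<Tuple4<Long, String, Long, Double>> alerts = joined
    .filter(r -> r.f2 > r.f3);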