> You can use apply(ReduceFunction, WindowFunction) or
> apply(FoldFunction, WindowFunction). These allow incremental aggregation of
> the result as elements arrive and don't require buffering of all elements
> until the window fires.
>
> Cheers,
> Aljoscha
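
For illustration, a minimal sketch of the incremental aggregation described
above, written against a later Flink 1.x API (the AllWindowFunction signature
in the 1.0-era builds discussed in this thread differs slightly; class names
and values here are only illustrative):

import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.windowing.AllWindowFunction;
import org.apache.flink.streaming.api.windowing.windows.GlobalWindow;
import org.apache.flink.util.Collector;

public class IncrementalCountWindow {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<Long> values = env.generateSequence(0, 29);

        values.countWindowAll(10)
              .apply(
                  // pre-aggregation: applied as each element arrives, so only the
                  // running sum is kept in state instead of all ten elements
                  new ReduceFunction<Long>() {
                      @Override
                      public Long reduce(Long a, Long b) {
                          return a + b;
                      }
                  },
                  // invoked when the window fires, with the single reduced value
                  new AllWindowFunction<Long, String, GlobalWindow>() {
                      @Override
                      public void apply(GlobalWindow window, Iterable<Long> reduced, Collector<String> out) {
                          out.collect("window sum: " + reduced.iterator().next());
                      }
                  })
              .print();

        env.execute("incremental count window");
    }
}

A plain apply(WindowFunction) would instead buffer all elements of the window
and hand them over as an Iterable only when the window fires.
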
>
> On Thu, 21 Apr 2016 at 16:53 Kostya Kulagin wrote:

> Regarding using the snapshot version, I would suggest you don't use it if you
> don't absolutely need one of the features in there that is not yet
> released. The builds are still pretty stable, however.
>
> Cheers,
> Aljoscha
>
> On Thu, 21 Apr 2016 at 13:53 Kostya Kulagin wrote:

> You could get the desired behavior by writing a custom Trigger that behaves
> like the count trigger but also fires when receiving a Long.MAX_VALUE
> watermark. A watermark of Long.MAX_VALUE signifies that a source has
> stopped processing for natural reasons.
>
> Cheers,
> Aljoscha
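
A sketch of what such a trigger could look like, written against a later
Flink 1.x Trigger API (the 1.0-era TriggerContext differs slightly, and the
class name here is made up). It assumes watermarks are flowing, since the
end-of-stream signal arrives as the final Long.MAX_VALUE watermark mentioned
above:

import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.common.state.ReducingState;
import org.apache.flink.api.common.state.ReducingStateDescriptor;
import org.apache.flink.api.common.typeutils.base.LongSerializer;
import org.apache.flink.streaming.api.windowing.triggers.Trigger;
import org.apache.flink.streaming.api.windowing.triggers.TriggerResult;
import org.apache.flink.streaming.api.windowing.windows.Window;

// Fires every maxCount elements, and also when the final Long.MAX_VALUE
// watermark arrives, so a last, partially filled window is still emitted.
public class CountOrEndOfStreamTrigger<W extends Window> extends Trigger<Object, W> {

    private final long maxCount;

    private final ReducingStateDescriptor<Long> countDesc =
            new ReducingStateDescriptor<>("count", new Sum(), LongSerializer.INSTANCE);

    public CountOrEndOfStreamTrigger(long maxCount) {
        this.maxCount = maxCount;
    }

    @Override
    public TriggerResult onElement(Object element, long timestamp, W window, TriggerContext ctx) throws Exception {
        // An event-time timer at Long.MAX_VALUE only fires once the watermark
        // reaches Long.MAX_VALUE, i.e. when the finite source has finished.
        ctx.registerEventTimeTimer(Long.MAX_VALUE);

        ReducingState<Long> count = ctx.getPartitionedState(countDesc);
        count.add(1L);
        if (count.get() >= maxCount) {
            // emit and drop the window contents, like a tumbling count window
            count.clear();
            return TriggerResult.FIRE_AND_PURGE;
        }
        return TriggerResult.CONTINUE;
    }

    @Override
    public TriggerResult onEventTime(long time, W window, TriggerContext ctx) {
        // flush whatever is left in the window at end of stream
        return time == Long.MAX_VALUE ? TriggerResult.FIRE_AND_PURGE : TriggerResult.CONTINUE;
    }

    @Override
    public TriggerResult onProcessingTime(long time, W window, TriggerContext ctx) {
        return TriggerResult.CONTINUE;
    }

    @Override
    public void clear(W window, TriggerContext ctx) throws Exception {
        ctx.getPartitionedState(countDesc).clear();
        ctx.deleteEventTimeTimer(Long.MAX_VALUE);
    }

    private static class Sum implements ReduceFunction<Long> {
        @Override
        public Long reduce(Long a, Long b) {
            return a + b;
        }
    }
}

It would be attached explicitly, e.g. via
stream.windowAll(GlobalWindows.create()).trigger(new CountOrEndOfStreamTrigger<>(10)),
instead of the plain countWindowAll(10), which installs the stock count trigger.
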
>
> On Thu, 21 Apr 2016 Kostya Kulagin wrote:

> You would need a custom Trigger that can trigger
> both on the count or after a long-enough timeout. It would be a combination
> of CountTrigger and EventTimeTrigger (or ProcessingTimeTrigger) so you
> could look to those to get started.
>
> Cheers,
> Aljoscha
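
Along those lines, a sketch of a combined count-or-timeout trigger (again an
illustrative name, written against a later Flink 1.x Trigger API; the timeout
side here uses processing time):

import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.common.state.ReducingState;
import org.apache.flink.api.common.state.ReducingStateDescriptor;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeutils.base.LongSerializer;
import org.apache.flink.streaming.api.windowing.triggers.Trigger;
import org.apache.flink.streaming.api.windowing.triggers.TriggerResult;
import org.apache.flink.streaming.api.windowing.windows.Window;

// Fires when maxCount elements have arrived, or timeoutMs of processing time
// after the first element of the window, whichever comes first.
public class CountWithTimeoutTrigger<W extends Window> extends Trigger<Object, W> {

    private final long maxCount;
    private final long timeoutMs;

    private final ReducingStateDescriptor<Long> countDesc =
            new ReducingStateDescriptor<>("count", new Sum(), LongSerializer.INSTANCE);
    private final ValueStateDescriptor<Long> deadlineDesc =
            new ValueStateDescriptor<>("deadline", LongSerializer.INSTANCE);

    public CountWithTimeoutTrigger(long maxCount, long timeoutMs) {
        this.maxCount = maxCount;
        this.timeoutMs = timeoutMs;
    }

    @Override
    public TriggerResult onElement(Object element, long timestamp, W window, TriggerContext ctx) throws Exception {
        ValueState<Long> deadline = ctx.getPartitionedState(deadlineDesc);
        if (deadline.value() == null) {
            // first element of this window: arm the timeout
            long t = ctx.getCurrentProcessingTime() + timeoutMs;
            deadline.update(t);
            ctx.registerProcessingTimeTimer(t);
        }

        ReducingState<Long> count = ctx.getPartitionedState(countDesc);
        count.add(1L);
        if (count.get() >= maxCount) {
            return fireAndReset(ctx);
        }
        return TriggerResult.CONTINUE;
    }

    @Override
    public TriggerResult onProcessingTime(long time, W window, TriggerContext ctx) throws Exception {
        // timeout expired: emit whatever has been collected so far
        return fireAndReset(ctx);
    }

    @Override
    public TriggerResult onEventTime(long time, W window, TriggerContext ctx) {
        return TriggerResult.CONTINUE;
    }

    @Override
    public void clear(W window, TriggerContext ctx) throws Exception {
        clearState(ctx);
    }

    private TriggerResult fireAndReset(TriggerContext ctx) throws Exception {
        clearState(ctx);
        return TriggerResult.FIRE_AND_PURGE;
    }

    private void clearState(TriggerContext ctx) throws Exception {
        ctx.getPartitionedState(countDesc).clear();
        ValueState<Long> deadline = ctx.getPartitionedState(deadlineDesc);
        if (deadline.value() != null) {
            ctx.deleteProcessingTimeTimer(deadline.value());
            deadline.clear();
        }
    }

    private static class Sum implements ReduceFunction<Long> {
        @Override
        public Long reduce(Long a, Long b) {
            return a + b;
        }
    }
}

As suggested above, Flink's own CountTrigger and ProcessingTimeTrigger sources
are the natural reference points for the details.
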
>
> On Wed, 20 Apr 2016 at 23:44 Kostya Kulagin wrote:
>

> (the second argument of LongStream.range(start, end) is exclusive)
>
> Cheers,
> Aljoscha
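
For reference, a quick way to check that behavior:

import java.util.stream.LongStream;

// end-exclusive: prints 0 .. 29, i.e. 30 values;
// LongStream.rangeClosed(0, 29) is the inclusive equivalent
LongStream.range(0, 30).forEach(System.out::println);
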
>
>
>
> On Thu, 21 Apr 2016 at 11:44 Kostya Kulagin wrote:
>
>> Actually this is not true - the source emits 30 values since it is
>> started with 0. If I change 29 to 33 the result will be the same.

> ... elements each.
> The last window is never triggered.
>
> Cheers,
> Aljoscha
>
> On Wed, 20 Apr 2016 at 23:52 Kostya Kulagin wrote:
I think it has something to do with parallelism, and I probably do not have a
clear understanding of how parallelism works in Flink, but in this example:
StreamExecutionEnvironment env =
StreamExecutionEnvironment.getExecutionEnvironment();
DataStreamSource source = env.addSource(new SourceFunction()
I have a pretty big but finite stream, and I need to be able to window it by
the number of elements.
In this case, from my observations, Flink can 'skip' the latest chunk of data
if it has fewer elements than the window size:
StreamExecutionEnvironment env =
    StreamExecutionEnvironment.getExecutionEnvironment();
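
Since the snippet above is cut off in the archive, here is a self-contained
sketch of the scenario being described (values and names are illustrative, not
the original code): a finite source of 33 longs with a count window of 10
produces output for the first three windows only, while the last three
elements are still buffered when the source finishes, so the default count
trigger never fires for them.

import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;

public class LastWindowSkipped {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // finite source emitting the 33 values 0 .. 32
        DataStreamSource<Long> source = env.addSource(new SourceFunction<Long>() {
            @Override
            public void run(SourceContext<Long> ctx) throws Exception {
                for (long i = 0; i < 33; i++) {
                    ctx.collect(i);
                }
            }

            @Override
            public void cancel() {
            }
        });

        // three full windows of 10 fire (sums 45, 145, 245); the trailing
        // three elements never produce output with the default count trigger
        source.countWindowAll(10)
              .reduce(new ReduceFunction<Long>() {
                  @Override
                  public Long reduce(Long a, Long b) {
                      return a + b;
                  }
              })
              .print();

        env.execute("last partial count window");
    }
}

With a trigger like the ones sketched earlier (firing on the final
Long.MAX_VALUE watermark, or after a timeout), the remaining elements would be
flushed as a final, smaller window.
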