> function would collect arriving elements and perform a lookup query after a
> certain number of elements have arrived (this can cause high latency if the
> arrival rate of elements is low or varies).
> The flatMap function can be executed in parallel and does not require a
> keyed stream.
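A dependency-free sketch of the buffering flatMap idea described above. The `Collector` interface here is a minimal stand-in for Flink's `org.apache.flink.util.Collector`, and the batch size, class name, and `lookup` logic are all hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal stand-in for Flink's Collector, so the sketch runs without Flink.
interface Collector<T> {
    void collect(T record);
}

// Buffers incoming elements and runs one "lookup" per full batch.
// A real implementation would also flush on a timer to bound latency
// when the arrival rate is low, as the message above points out.
class BatchingLookupFlatMap {
    private final int batchSize;
    private final List<String> buffer = new ArrayList<>();

    BatchingLookupFlatMap(int batchSize) {
        this.batchSize = batchSize;
    }

    void flatMap(String value, Collector<String> out) {
        buffer.add(value);
        if (buffer.size() >= batchSize) {
            for (String hit : lookup(buffer)) {
                out.collect(hit);
            }
            buffer.clear();
        }
    }

    // Hypothetical batched lookup: keeps only elements containing "keep".
    private List<String> lookup(List<String> batch) {
        List<String> hits = new ArrayList<>();
        for (String s : batch) {
            if (s.contains("keep")) {
                hits.add(s);
            }
        }
        return hits;
    }
}
```

Nothing is emitted until a batch fills, which is exactly the latency trade-off the message describes.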
>
> Cheers,
> Aljoscha
>
> On Sat, 23 Apr 2016 at 16:57 Konstantin Kulagin
> wrote:
>
>> I finally was able to do that. Kinda ugly, but works:
>>
>> https://gist.github.com/krolen/ed1344e4d7be5b2116061685268651f5
>>
>>
>>
> On Fri, Apr 22, 2016 at 6:14 PM, Konstantin Kulagin
> wrote:
> For a finite data set, elements are also streamed through the topology.
> Only if you use operations that require grouping or sorting (such as
> groupBy/reduce and join) will elements be buffered in memory or on disk
> before they are processed.
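The pipelined-vs-buffering distinction can be illustrated with plain `java.util.stream` as an analogy (this is not Flink's runtime; the class and names are made up for illustration): `map` is applied element by element as data flows through, while a grouping collector must consume its whole input before any group is complete.

```java
import java.util.Map;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
import java.util.stream.Stream;

class PipelineVsGrouping {
    // Counts how many elements have flowed through the map stage.
    static final AtomicInteger mapped = new AtomicInteger();

    // Streamed stage: runs once per element, no buffering needed.
    static Stream<String> tagged(Stream<Integer> in) {
        return in.map(i -> {
            mapped.incrementAndGet();
            return (i % 2 == 0) ? "even" : "odd";
        });
    }

    // Grouping stage: must see every element before the result exists,
    // analogous to groupBy/reduce buffering in a finite Flink job.
    static Map<String, Long> grouped(Stream<Integer> in) {
        return tagged(in).collect(
                Collectors.groupingBy(s -> s, Collectors.counting()));
    }
}
```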
>
> To answer your last question:
Hi guys,
I have a general question, to get more understanding of
stream vs. finite data transformation. More specifically, I am trying to
understand entities' lifecycle during processing.
1) For example, in the case of streams: suppose we start with some key-value
source, parallelize it
I finally was able to do that. Kinda ugly, but works:
https://gist.github.com/krolen/ed1344e4d7be5b2116061685268651f5
On Fri, Apr 22, 2016 at 6:14 PM, Konstantin Kulagin
wrote:
> I was trying to implement this (force Flink to handle all values from the
> input) but had no success...
> Probably I am not getting something about Flink's windowing mechanism.
I was trying to implement this (force Flink to handle all values from the
input) but had no success...
Probably I am not getting something about Flink's windowing mechanism.
I've created my 'finishing' trigger, which is basically a copy of the
purging trigger, but I was not able to make it work:
https://gist.github.c
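Flink's actual `Trigger`/`PurgingTrigger` classes are not reproduced here; this is a dependency-free sketch of the count-then-fire-and-purge logic such a "finishing" trigger follows. The enum is a stand-in for Flink's `TriggerResult`, and the class name and count are hypothetical:

```java
// Stand-in for org.apache.flink.streaming.api.windowing.triggers.TriggerResult.
enum TriggerResult { CONTINUE, FIRE_AND_PURGE }

// Fires and purges the window once a fixed number of elements has arrived,
// which is the behavior a count-based "finishing" trigger is after.
class CountFinishTrigger {
    private final long maxCount;
    private long count; // in real Flink this would live in partitioned state

    CountFinishTrigger(long maxCount) {
        this.maxCount = maxCount;
    }

    TriggerResult onElement() {
        count++;
        if (count >= maxCount) {
            count = 0; // purge: reset for the next batch
            return TriggerResult.FIRE_AND_PURGE;
        }
        return TriggerResult.CONTINUE;
    }
}
```

In real Flink the counter must be kept in the trigger's partitioned state rather than an instance field, since one trigger object serves many keys and windows.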
No problem at all; there are not many Flink people and a lot of people
asking questions - it must be hard to keep up with each person's issues :)
Yes, it is not as easy as a 'contains' operator: I need to collect some
amount of tuples in order to create an in-memory Lucene index. After that I
will filter entries
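A minimal sketch of that collect-then-index-then-filter shape, with a HashMap-based inverted index standing in for the real in-memory Lucene index (all class and method names here are hypothetical):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Toy inverted index standing in for an in-memory Lucene index:
// maps each token to the ids of the documents containing it.
class ToyIndex {
    private final Map<String, Set<Integer>> postings = new HashMap<>();

    void add(int docId, String text) {
        for (String token : text.toLowerCase().split("\\s+")) {
            postings.computeIfAbsent(token, k -> new HashSet<>()).add(docId);
        }
    }

    Set<Integer> search(String token) {
        return postings.getOrDefault(token.toLowerCase(), new HashSet<>());
    }
}

class BatchFilter {
    // Collect a batch of (id, text) entries, index them, then keep only
    // the ids of the entries matching the query token.
    static List<Integer> filterBatch(Map<Integer, String> batch, String query) {
        ToyIndex index = new ToyIndex();
        batch.forEach(index::add);
        List<Integer> kept = new ArrayList<>(index.search(query));
        kept.sort(null);
        return kept;
    }
}
```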
Hi guys,
trying to run this example:

StreamExecutionEnvironment env =
        StreamExecutionEnvironment.getExecutionEnvironment();
// note: the generic type parameters below were stripped by the mail
// archive; Tuple2<Long, String> is assumed, matching the key-value
// source discussed above
DataStreamSource<Tuple2<Long, String>> source = env.addSource(
        new SourceFunction<Tuple2<Long, String>>() {
            @Override
            public void run(SourceContext<Tuple2<Long, String>> ctx)
                    throws Exception {
                LongStream.rang
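The source above is cut off mid-line, but it presumably emits a range of (Long, String) pairs via `LongStream`. A dependency-free sketch of that generation step, with the pair type and value format assumed:

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.LongStream;

class PairSource {
    // Emits (i, "value-" + i) for i in [0, n), standing in for what the
    // Flink SourceFunction's run() would pass to ctx.collect(...).
    static List<SimpleEntry<Long, String>> generate(long n) {
        return LongStream.range(0, n)
                .mapToObj(i -> new SimpleEntry<>(i, "value-" + i))
                .collect(Collectors.toList());
    }
}
```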