This already sounds awfully complicated. Is there no other way to
implement the delta windows?

On Wed, Jun 3, 2015 at 7:52 AM, Gyula Fóra <gyula.f...@gmail.com> wrote:
> Hi Ufuk,
>
> In the concrete use case I have in mind I only want to send events to
> another subtask of the same task vertex.
>
> Specifically: if we want to do distributed delta-based windows, then after
> every trigger we need to send the element that triggered the current window
> to the other subtasks. So in practice I want to regularly broadcast an event
> to all subtasks of the same operator.
>
> In this case the operators would wait until they receive this event, so we
> need to make sure that sending it is not blocked by the actual records.
>
> Gyula
>
> On Tuesday, June 2, 2015, Ufuk Celebi <u...@apache.org> wrote:
>
>>
>> On 02 Jun 2015, at 22:45, Gyula Fóra <gyf...@apache.org>
>> wrote:
>> > I am wondering, what is the suggested way to send some events directly to
>> > another parallel instance in a Flink job? For example from one mapper to
>> > another mapper (of the same operator).
>> >
>> > Do we have any internal support for this? The first thing that we thought
>> > of is iterations, but that is clearly overkill.
>>
>> There is no support for this at the moment. Any parallel instance? Or a
>> subtask instance of the same task?
>>
>> Can you provide more input on the use case? It is certainly possible to
>> add support for this.
>>
>> If the events don't need to be inline with the records, we can easily
>> set up the TaskEventDispatcher as a separate actor (or extend the task
>> manager) to process backwards-flowing events and, in general, any events
>> that don't need to travel inline with the records. The task deployment
>> descriptors would need to be extended with the extra parallel instance
>> information.
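
As a rough sketch of how such an out-of-band channel might be exposed to a
subtask, something along these lines could work (none of these types exist
in Flink; the names only illustrate the idea of routing events through an
extended TaskEventDispatcher instead of the record channels):

    import org.apache.flink.runtime.event.TaskEvent; // package path may differ across versions

    /**
     * Hypothetical gateway a subtask could use to exchange events with the
     * other parallel subtasks of the same task vertex, outside the record
     * stream. Not existing Flink API.
     */
    public interface SubtaskEventGateway {

        /** Callback invoked when a sibling subtask sends an event. */
        interface SiblingEventListener {
            void onEvent(TaskEvent event);
        }

        /** Broadcast an event to all parallel subtasks of the same task vertex. */
        void broadcastToSiblings(TaskEvent event);

        /** Register a listener for events coming from sibling subtasks. */
        void subscribe(SiblingEventListener listener);
    }

With something along these lines, the trigger event sketched above could be
broadcast by the subtask that fired the window and consumed by its siblings
without touching the data channels.
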
>>
>> – Ufuk
