Hi!
Just to avoid confusion: the DataStream network readers do not currently
support backpressuring only one input (this conflicts with other design
aspects). (The DataSet network readers do support that, FYI.)
How about simply "correcting" the order later? If you have pre-sorted data
per stream...
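The "correct the order later" idea can be sketched in plain Java (this is not Flink API, just an illustration with an assumed minimal `Event` type): if each input is already sorted by timestamp, a two-way ordered merge always consumes the event with the smaller timestamp first.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Hypothetical sketch (not Flink API): ordered consumption of two
// per-stream pre-sorted inputs, always taking the event with the
// smaller timestamp first.
public class OrderedMerge {

    // Minimal event type assumed for illustration.
    record Event(long timestamp, String payload) {}

    static List<Event> merge(List<Event> a, List<Event> b) {
        List<Event> out = new ArrayList<>();
        Iterator<Event> ia = a.iterator(), ib = b.iterator();
        Event ea = ia.hasNext() ? ia.next() : null;
        Event eb = ib.hasNext() ? ib.next() : null;
        while (ea != null && eb != null) {
            if (ea.timestamp() <= eb.timestamp()) {
                out.add(ea);
                ea = ia.hasNext() ? ia.next() : null;
            } else {
                out.add(eb);
                eb = ib.hasNext() ? ib.next() : null;
            }
        }
        // Drain whichever input still has events.
        while (ea != null) { out.add(ea); ea = ia.hasNext() ? ia.next() : null; }
        while (eb != null) { out.add(eb); eb = ib.hasNext() ? ib.next() : null; }
        return out;
    }
}
```

Note that this only works if both inputs keep delivering events; if one input is idle, the merge must stall, which is exactly the head-of-line blocking mentioned elsewhere in this thread.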
Hi Stephan,
In one of our components we have to process events in order, due to
business logic requirements.
That certainly introduces a bottleneck, but the other aspects are fine.
I'm not talking about actually re-sorting the data, just about consuming it
in the right order.
I.e. if two streams are al...
Hi Dmitry!
The streaming runtime makes a conscious decision to not merge streams as in
an ordered merge.
The reason is that, at large scale, this is typically bad for scalability
and network performance.
Also, in certain DAGs, it may lead to deadlocks.
Even the two-input operator delivers records on...
Hi Timo,
I don't have any key to join on, so I'm not sure Window Join would work for
me.
Can I implement my own "low level" operator in any way?
I would appreciate it if you could give me a hint or a link to an example of
how to do it.
Best regards,
Dmitry
On Tue, Jan 17, 2017 at 9:24 AM, Timo Walther ...
Hi Dmitry,
the runtime supports an arbitrary number of inputs; however, the API
currently does not provide a convenient way to use them. You could use the
"union" operator to reduce the number of inputs. Otherwise I think you have
to implement your own operator. That depends on your use case, though.
You...
Hi,
there are only *two* interfaces defined at the moment:
*OneInputStreamOperator* and *TwoInputStreamOperator*.
Is there any way to define an operator with an arbitrary number of inputs?
Another concern is how to maintain *backpressure* in the operator.
Let's say I read events from two Kafka s...
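The backpressure question above can be pictured with a bounded buffer between producer and consumer, a deliberate simplification of what Flink's runtime does with bounded network buffers (all names here are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Plain-Java illustration of backpressure via a bounded buffer:
// when the consumer is slow, the full queue blocks the producer's
// put(), throttling it. Flink's runtime achieves a similar effect
// with bounded network buffers rather than a single in-memory queue.
public class BackpressureDemo {

    static List<Integer> run(int n) throws InterruptedException {
        // Tiny capacity on purpose, so the producer is throttled quickly.
        BlockingQueue<Integer> buffer = new ArrayBlockingQueue<>(2);
        List<Integer> consumed = new ArrayList<>();

        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < n; i++) {
                    buffer.put(i); // blocks when full -> backpressure
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        for (int i = 0; i < n; i++) {
            consumed.add(buffer.take()); // the consumer paces the producer
        }
        producer.join();
        return consumed;
    }
}
```

This also hints at why backpressuring only one input of a two-input operator is awkward: with a shared network channel, throttling one logical input would block buffers that the other input needs as well.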