Re: Forking a stream with Flink

2019-01-31 Thread Dawid Wysakowicz
Hi Daniel, The answer to your original question is that you can just keyBy [1], e.g. by the machineId, and then computations on the resulting KeyedStream are applied independently for each key. Best, Dawid [1] https://ci.apache.org/projects/flink/flink-docs-release-1.7/dev/stream/operators/#datastream-transformations
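Dawid's point is that keyBy partitions the stream so per-key computations run independently. The pure-JDK sketch below only mimics that semantics (it does not use Flink); `MachineEvent`, `machineId`, and `temp` are illustrative names, not from the thread.

```java
import java.util.*;
import java.util.stream.*;

// Pure-JDK sketch of what keyBy(machineId) means: events are partitioned
// by key, and a computation (here: max temperature) runs independently
// per key, as it would on a Flink KeyedStream.
public class KeyBySketch {
    record MachineEvent(String machineId, double temp) {}

    // Group events by machineId, then reduce each group independently,
    // mirroring stream.keyBy(e -> e.getMachineId()).max(...) in Flink.
    static Map<String, Double> maxTempPerMachine(List<MachineEvent> events) {
        return events.stream().collect(Collectors.groupingBy(
                MachineEvent::machineId,
                Collectors.reducing(Double.NEGATIVE_INFINITY,
                        MachineEvent::temp, Math::max)));
    }

    public static void main(String[] args) {
        List<MachineEvent> events = List.of(
                new MachineEvent("m1", 20.0),
                new MachineEvent("m2", 35.5),
                new MachineEvent("m1", 22.5));
        System.out.println(maxTempPerMachine(events).get("m1")); // 22.5
    }
}
```

In Flink the grouping and reduction would be distributed and incremental; the per-key independence is the same.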

Re: Forking a stream with Flink

2019-01-31 Thread Daniel Krenn
I don't get what happened here. Did Selvaraj just hijack this question? Or what is going on? On Tue., Jan. 29, 2019 at 17:01, Selvaraj chennappan < selvarajchennap...@gmail.com> wrote: > I think there is a misunderstanding. I want to compare the raw JSON and > transformed record. > Hence I need t

Re: Forking a stream with Flink

2019-01-29 Thread Selvaraj chennappan
I think there is a misunderstanding. I want to compare the raw JSON and the transformed record. Hence I need two consumers, and to merge the streams for comparison. I have a pipeline defined. The pipeline does source (Kafka), transformation, dedup, and persisting to the DB. [image: image.png] Before reaching the DB task l
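Selvaraj's goal is to merge two streams (raw JSON and transformed records) and compare them per record. In Flink this is typically done by keying both streams on a record id and connecting them (connect plus a CoProcessFunction); the pure-JDK sketch below shows only the per-key comparison step. The record ids and the stand-in transform are illustrative, not from the thread.

```java
import java.util.*;

// Per-key comparison of a raw stream against a transformed stream:
// re-apply the transformation to each raw record and flag ids whose
// transformed counterpart does not match.
public class CompareSketch {
    static List<String> mismatchedIds(Map<String, String> rawById,
                                      Map<String, String> transformedById) {
        List<String> mismatches = new ArrayList<>();
        for (var e : rawById.entrySet()) {
            String expected = transform(e.getValue());
            if (!expected.equals(transformedById.get(e.getKey()))) {
                mismatches.add(e.getKey());
            }
        }
        return mismatches;
    }

    // Stand-in for the real (complex) transformation rules.
    static String transform(String rawJson) {
        return rawJson.trim().toUpperCase();
    }

    public static void main(String[] args) {
        Map<String, String> raw = Map.of("1", " a ", "2", "b");
        Map<String, String> transformed = Map.of("1", "A", "2", "WRONG");
        System.out.println(mismatchedIds(raw, transformed)); // [2]
    }
}
```

In a streaming setting the two sides arrive independently, so the Flink version would buffer the first-arriving side in keyed state until its counterpart shows up.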

Re: Forking a stream with Flink

2019-01-29 Thread Puneet Kinra
Hi Selvaraj, In your POJO, add a data member such as status, and set it to error in case the record is invalid. Pass the output of the flatMap to the split operator; there you can split the stream. On Tue, Jan 29, 2019 at 6:39 PM Selvaraj chennappan < selvarajchennap...@gmail.com> wrote: > UseCase:- We h
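Puneet's suggestion: tag each POJO with a status field after the flatMap, then split the stream on that field. Flink 1.7 offered split()/select() for this (later replaced by side outputs with an OutputTag); the pure-JDK sketch below shows only the tagging-and-splitting idea. The field and class names are illustrative.

```java
import java.util.*;

// Partition tagged records into per-status sub-streams, as Flink's
// split()/select() (or a side output) would route them.
public class SplitSketch {
    record TaggedRecord(String payload, String status) {}

    static Map<String, List<TaggedRecord>> splitByStatus(List<TaggedRecord> records) {
        Map<String, List<TaggedRecord>> out = new HashMap<>();
        for (TaggedRecord r : records) {
            // Each status value ("ok", "error", ...) gets its own bucket.
            out.computeIfAbsent(r.status(), k -> new ArrayList<>()).add(r);
        }
        return out;
    }

    public static void main(String[] args) {
        var split = splitByStatus(List.of(
                new TaggedRecord("good", "ok"),
                new TaggedRecord("bad", "error")));
        System.out.println(split.get("error").size()); // 1
    }
}
```

In current Flink, the idiomatic equivalent is a ProcessFunction that emits invalid records to an OutputTag side output and valid ones downstream.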

Re: Forking a stream with Flink

2019-01-29 Thread Selvaraj chennappan
UseCase:- We have a Kafka consumer to read messages (JSON); it then applies a flatMap for transformation based on the rules (the rules are complex) and converts each message to a POJO. We want to verify that the record (POJO) is valid by checking that record field by field. If a record is invalid due to transformation
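The use case boils down to a field-by-field validity check on the POJO produced by the flatMap. The pure-JDK sketch below shows one way to express such a check; the POJO fields and rules are illustrative, not from the thread.

```java
// Field-by-field validation of a transformed record: the record is valid
// only if every field passes its rule. In the pipeline discussed above,
// the result of this check would set the POJO's status field.
public class ValidateSketch {
    record Order(String id, double amount) {}

    static boolean isValid(Order o) {
        return o.id() != null && !o.id().isBlank()  // id must be present
                && o.amount() > 0;                  // amount must be positive
    }

    public static void main(String[] args) {
        System.out.println(isValid(new Order("o1", 9.99))); // true
        System.out.println(isValid(new Order("", 9.99)));   // false
    }
}
```

Inside a Flink flatMap, this boolean would decide whether the record is emitted on the main path or routed to an error path.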

Re: Forking a stream with Flink

2019-01-29 Thread miki haiat
I'm not sure if I got your question correctly; can you elaborate more on your use case?