Hi,
Instead of a join, I would suggest using a connected FlatMap [1] (or a
connected ProcessFunction [2]). The problem with a join is that the rules only
"survive" for the length of the window, while I suspect that you want them to
survive longer than that so that they can be applied to events arriving in the
future.
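
Here is a minimal sketch of what I mean, assuming hypothetical Tracking, Rule,
and Alert POJOs that share a key (a device id in this example) and a
hypothetical Rule.matches() check. The rules are kept in keyed state, so they
survive indefinitely and are applied to every tracking event that arrives
later:

import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.co.RichCoFlatMapFunction;
import org.apache.flink.util.Collector;

DataStream<Tracking> tracking = ...; // from the "tracking" Kafka topic
DataStream<Rule> rules = ...;        // from the "rules" Kafka topic

tracking
    .connect(rules)
    .keyBy(t -> t.getDeviceId(), r -> r.getDeviceId())
    .flatMap(new RichCoFlatMapFunction<Tracking, Rule, Alert>() {

        // latest rule per key, checkpointed as part of Flink's keyed state
        private transient ValueState<Rule> ruleState;

        @Override
        public void open(Configuration parameters) {
            ruleState = getRuntimeContext().getState(
                new ValueStateDescriptor<>("rule", Rule.class));
        }

        @Override
        public void flatMap1(Tracking event, Collector<Alert> out) throws Exception {
            // called for every element on the tracking stream
            Rule rule = ruleState.value();
            if (rule != null && rule.matches(event)) {
                out.collect(new Alert(event, rule));
            }
        }

        @Override
        public void flatMap2(Rule rule, Collector<Alert> out) throws Exception {
            // called for every element on the rules stream:
            // store/update the rule so it applies to all future events
            ruleState.update(rule);
        }
    });

If a rule is not tied to a single key but should apply to all tracking events,
you can broadcast() the rules stream instead of keying it and keep the rules in
an in-memory map inside the function (note that such a map is not part of
checkpointed state).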

Best,
Aljoscha 

[1] 
https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/datastream_api.html#datastream-transformations
[2] 
https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/stream/process_function.html#low-level-joins

> On 2. May 2017, at 18:58, Tarek khal <tarek.khal.leta...@gmail.com> wrote:
> 
> I have two Kafka topics (tracking and rules) and I would like to join the
> "tracking" datastream with the "rules" datastream as the data arrives in the
> "tracking" datastream.
> 
> 
> Here is the result that I expect, but I would like to get it without
> restarting the job (here I had to restart the job to get this result):
> 
> <http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/file/n12954/Capture.jpg>
>  
> 
> Code:
> 
> 
> 
> 
> 
