Hi, 

As far as I know, the community currently has no plans to support custom
triggers in Flink SQL, because triggers are difficult to express in SQL.

You can create a JIRA ticket [1] for it and restart the discussion on the dev
mailing list.

[1] https://issues.apache.org/jira/projects/FLINK

--

    Best!
    Xuyang

At 2024-07-17 17:22:01, "liu ze" <liuze0...@gmail.com> wrote:

Hi,

Currently, Flink's windows are based on time (or a fixed number of elements). I
want to trigger window computation based on specific events (markers embedded
in the data). In the DataStream API this can be achieved with GlobalWindows and
a custom trigger, but how can it be done in Flink SQL? Additionally, it is
necessary to ensure that the upstream and downstream windows process the same
batch of data.
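
For concreteness, the DataStream version I have in mind is roughly the
following minimal sketch, where Event is a placeholder POJO for our records
and isEndMarker() reads the special flag carried inside the data:

import org.apache.flink.streaming.api.windowing.triggers.Trigger;
import org.apache.flink.streaming.api.windowing.triggers.TriggerResult;
import org.apache.flink.streaming.api.windowing.windows.GlobalWindow;

// Fires a GlobalWindow when a specially marked record arrives in the data.
public class MarkerTrigger extends Trigger<Event, GlobalWindow> {

    @Override
    public TriggerResult onElement(Event element, long timestamp,
                                   GlobalWindow window, TriggerContext ctx) {
        // Fire and purge when the marker record arrives, so each window
        // holds exactly the batch of data delimited by markers.
        return element.isEndMarker()
                ? TriggerResult.FIRE_AND_PURGE
                : TriggerResult.CONTINUE;
    }

    @Override
    public TriggerResult onProcessingTime(long time, GlobalWindow window,
                                          TriggerContext ctx) {
        return TriggerResult.CONTINUE; // no time-based firing
    }

    @Override
    public TriggerResult onEventTime(long time, GlobalWindow window,
                                     TriggerContext ctx) {
        return TriggerResult.CONTINUE; // no time-based firing
    }

    @Override
    public void clear(GlobalWindow window, TriggerContext ctx) {
        // no timers or custom state to clean up
    }
}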

This is a common requirement. For example, we need to calculate some metrics
for a user keyed by their user ID. This data has no time attribute field, so
the only way to determine that a batch is complete and trigger window
computation is a specially marked record in the stream. The computed results
are then forwarded downstream via Kafka. The downstream job might perform
coarser-grained calculations for the same user, so it must see a complete
batch that exactly matches the data in the upstream window.
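
Wiring it together, the upstream job would look roughly like this; Event,
Result, and MyMetricsAggregate (an AggregateFunction over Event) are
placeholders for our actual types, and the KafkaSink is assumed to be
configured elsewhere:

import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.windowing.assigners.GlobalWindows;

public class MarkerWindowJob {

    static void build(DataStream<Event> events, KafkaSink<Result> sink) {
        events
            .keyBy(Event::getUserId)             // one logical window per user ID
            .window(GlobalWindows.create())      // no time-based boundary
            .trigger(new MarkerTrigger())        // fire only on the marker record
            .aggregate(new MyMetricsAggregate()) // compute the per-user metrics
            .sinkTo(sink);                       // forward the results via Kafka
    }
}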

Can Flink SQL achieve this functionality?
