Hi Chengcheng Zhang,
  
          Is this your scenario? Every day is divided into 24 hours;
taking today as an example: 2020081600, 2020081601, ..., 2020081623.
If we want to count PV (page views), we can count it like this:
INSERT INTO cumulative_pv
SELECT time_str, count(1)
FROM pv_per_hour
GROUP BY time_str;
In this SQL, time_str is one of the hour strings 2020081600, 2020081601, ..., 2020081623. Because this is a plain GROUP BY rather than a time window, the count for the current hour is updated every time a new record arrives, so the latest result is visible before the hour ends.
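To make this a bit more concrete, here is a rough sketch of what the whole job could look like with Flink 1.11 SQL DDL. The table names, fields, topic, and connector options below are only my assumptions for illustration; adjust them to your own setup:

-- Kafka source (hypothetical topic and schema).
CREATE TABLE pv_per_hour (
  user_id BIGINT,
  ts TIMESTAMP(3),
  -- derive the hour string, e.g. 2020081600, from the event time
  time_str AS DATE_FORMAT(ts, 'yyyyMMddHH')
) WITH (
  'connector' = 'kafka',
  'topic' = 'pv_events',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json'
);

-- MySQL upsert sink keyed by the hour string (hypothetical database).
CREATE TABLE cumulative_pv (
  time_str STRING,
  pv BIGINT,
  PRIMARY KEY (time_str) NOT ENFORCED
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:mysql://localhost:3306/report',
  'table-name' = 'cumulative_pv'
);

-- Plain GROUP BY (no window): each incoming record updates the count
-- for its hour, so the row for the current hour keeps refreshing in MySQL.
INSERT INTO cumulative_pv
SELECT time_str, COUNT(1) AS pv
FROM pv_per_hour
GROUP BY time_str;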


These references may also be helpful:
[1] http://apache-flink.147419.n8.nabble.com/flink-sql-5-5-td2011.html
[2] http://wuchong.me/blog/2020/02/25/demo-building-real-time-application-with-flink-sql/


Hope this helps.


Best, forideal


At 2020-08-16 12:05:04, "Chengcheng Zhang" <274522...@qq.com> wrote:

Hi,
I'm a new user of Flink and have been quite puzzled by the results of time-based window aggregations.
For our business, hourly and daily reports have to be created, ideally in real time. So I used an event-time window aggregation to consume the Kafka data stream, but found that the newest result only shows up on the console, or gets upserted to MySQL, after the current hour or day has passed.
How can I get the latest window result immediately after a stream record falls into it? Is there a specific configuration option for this, hopefully?
Please help and rescue me.
Best regards.
