Hi,
Currently, Flink's windows are based on time (or a fixed number of
elements). I want to trigger window computation based on specific marker
events contained in the data itself. In the DataStream API this can be
achieved with GlobalWindows and a custom Trigger, but how can it be done
in Flink SQL?
Additio
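For reference, here is a minimal sketch of the DataStream approach I mean,
assuming a hypothetical MyEvent type whose isEndMarker() flag marks the element
that should close the window (getKey() and merge() are placeholders as well):

import org.apache.flink.streaming.api.windowing.assigners.GlobalWindows;
import org.apache.flink.streaming.api.windowing.triggers.Trigger;
import org.apache.flink.streaming.api.windowing.triggers.TriggerResult;
import org.apache.flink.streaming.api.windowing.windows.GlobalWindow;

// Fires (and purges) the global window whenever an element carries the
// end-of-window marker; otherwise the window keeps accumulating elements.
public class MarkerTrigger extends Trigger<MyEvent, GlobalWindow> {
    @Override
    public TriggerResult onElement(MyEvent element, long timestamp,
                                   GlobalWindow window, TriggerContext ctx) {
        return element.isEndMarker()
                ? TriggerResult.FIRE_AND_PURGE
                : TriggerResult.CONTINUE;
    }

    @Override
    public TriggerResult onProcessingTime(long time, GlobalWindow window, TriggerContext ctx) {
        return TriggerResult.CONTINUE;
    }

    @Override
    public TriggerResult onEventTime(long time, GlobalWindow window, TriggerContext ctx) {
        return TriggerResult.CONTINUE;
    }

    @Override
    public void clear(GlobalWindow window, TriggerContext ctx) {
    }
}

// Usage in the DataStream API:
stream
    .keyBy(MyEvent::getKey)
    .window(GlobalWindows.create())
    .trigger(new MarkerTrigger())
    .reduce((a, b) -> a.merge(b));

I am looking for an equivalent way to express this in Flink SQL.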
On Sep 4, 2019 at 7:48 PM, Wesley Peng wrote:
> Hi
>
> on 2019/9/4 19:30, liu ze wrote:
> > I use the ROW_NUMBER() OVER() function to compute Top-N. The total
> > amount of data is only 60,000 rows, but the state has grown to 12 GB
> > and the job eventually fails with an OOM. Is there any way to optimize this?
>
> ref
Hi,
I use the ROW_NUMBER() OVER() function to compute Top-N. The total amount of
data is only 60,000 rows, but the state has grown to 12 GB and the job
eventually fails with an OOM. Is there any way to optimize this?
Thanks
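For context, this is roughly the shape of the Top-N query I mean, assuming an
already-created StreamTableEnvironment named tableEnv and a registered table
shop_sales with category/item/sales columns (all of these names are placeholders).
One optimization mentioned in the Flink SQL docs is to leave the rownum column
out of the outer SELECT, which lets the planner use a more compact state layout:

import org.apache.flink.table.api.Table;

// Top-N per category; keep only the 10 highest-selling items in each category.
// Not selecting rownum in the outer query enables the "no ranking output"
// optimization, which can reduce the state kept by the Top-N operator.
Table topN = tableEnv.sqlQuery(
    "SELECT category, item, sales " +
    "FROM ( " +
    "  SELECT *, " +
    "    ROW_NUMBER() OVER (PARTITION BY category ORDER BY sales DESC) AS rownum " +
    "  FROM shop_sales " +
    ") " +
    "WHERE rownum <= 10");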
Hi,
I want to update a third-party system inside a MapFunction. Does the
MapFunction need to implement CheckpointedFunction?
For example, in the MapFunction I want to update MySQL. Do I need to implement
CheckpointedFunction and manage the state myself?
stream = env.addSource(...);
stream.map(record -> {
    // update MySQL here, e.g. "INSERT ... ON DUPLICATE KEY UPDATE ..."
    return record;
});