termark advancement. See [2] for a solution.
>
> [1]
> https://github.com/apache/flink/blob/release-1.11/flink-streaming-java/src/main/java/org/apache/flink/streaming/api/operators/ProcessOperator.java#L72-L72
> [2]
> https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/event
ook in a different direction: Could it be that
> your Sink/Async IO is not processing data (fast enough)?
> Since you have a bounded watermark strategy, you'd need to see 10s of data
> being processed before the first watermark is emitted.
> To test that, can you please simply re
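To make the "10 s of data first" point concrete, here is a small self-contained Java model of bounded out-of-orderness watermarking. This is plain Java, not the Flink API, but Flink's `BoundedOutOfOrdernessWatermarks` computes the same value on each periodic emit (watermark = max seen timestamp - bound - 1 ms); the timestamps and the 10 s bound are illustrative:

```java
import java.util.List;

// Plain-Java model of a bounded out-of-orderness watermark generator.
// Flink's own generator emits the same value on each periodic emit
// (by default every 200 ms of processing time).
class BoundedWatermarkModel {
    private final long maxOutOfOrdernessMs;
    private long maxTimestamp = Long.MIN_VALUE;

    BoundedWatermarkModel(long maxOutOfOrdernessMs) {
        this.maxOutOfOrdernessMs = maxOutOfOrdernessMs;
    }

    // Track the highest event timestamp seen so far.
    void onEvent(long eventTimestampMs) {
        maxTimestamp = Math.max(maxTimestamp, eventTimestampMs);
    }

    // watermark = max seen timestamp - out-of-orderness bound - 1 ms
    long currentWatermark() {
        return maxTimestamp - maxOutOfOrdernessMs - 1;
    }

    public static void main(String[] args) {
        BoundedWatermarkModel wm = new BoundedWatermarkModel(10_000); // 10 s bound
        for (long ts : List.of(1_000L, 4_000L, 2_500L)) {
            wm.onEvent(ts);
        }
        // Only ~4 s of event time observed, so the watermark has not
        // even reached 0 yet; no event-time timer can fire.
        System.out.println(wm.currentWatermark()); // prints -6001

        wm.onEvent(12_000);
        System.out.println(wm.currentWatermark()); // prints 1999
    }
}
```

Until events spanning more than the 10 s bound have been read, the watermark stays behind the earliest event timestamps, which is why downstream event-time logic appears to stall at first.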
Flink 1.11
I have a simple Flink application that reads from Kafka, uses event
timestamps, assigns timestamps and watermarks, and then keys by a field and
uses a KeyedProcessFunction.
The keyed process function outputs events from within the `processElement`
method using `out.collect`. No timers are
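The shape described above, keying by a field and emitting from `processElement` via a collector with no timers, can be modeled in plain Java as follows. This is not the Flink API, just the data flow; the `Event` record and its fields are made up for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Plain-Java model of "keyBy(field) then KeyedProcessFunction emitting
// via out.collect" with no timers. Event and its fields are hypothetical.
class KeyedCollectModel {
    record Event(String key, long timestampMs, String payload) {}

    // Analogue of processElement: one element in, zero or more records
    // emitted through the collector callback.
    static void processElement(Event e, Consumer<String> out) {
        out.accept(e.key() + ":" + e.payload()); // forward a derived record
    }

    static List<String> run(List<Event> input) {
        List<String> collected = new ArrayList<>();
        for (Event e : input) {
            // In Flink, all elements with the same key are routed to the
            // same parallel instance; here we process in arrival order.
            processElement(e, collected::add);
        }
        return collected;
    }

    public static void main(String[] args) {
        List<String> out = run(List.of(
                new Event("a", 1_000, "x"),
                new Event("b", 2_000, "y")));
        System.out.println(out); // prints [a:x, b:y]
    }
}
```

Note that a function like this never depends on watermarks itself; only timers and event-time windows downstream do, which is why the watermark discussion above matters.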
Is it possible to dynamically, while the Flink application is running, inject
new SQL to be executed against a stream?
Thank you!
>
>
>
> Yes, in the upcoming StateFun release we are introducing exactly that :)
> As of the upcoming version, we are adding a capability to dynamically
> register and modify function types without the need to redeploy Flink.
>
> Igal.
>
> On Mon, Jan 11, 2021 at 10:48 PM A
Hi,
I see that you need to tell the Flink Stateful Functions runtime about remote
stateful function modules via a YAML file provided at deploy time. Given
that remote modules and stateful functions are an external deployment concern
anyway, is it possible to dynamically associate Remote Modules with Remote
Functi
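For reference, the deploy-time binding in question is a module YAML along these lines. This is a sketch of the StateFun 2.x-era format; the exact version string and layout differ between StateFun releases, and the function type and endpoint URL here are placeholders:

```yaml
# module.yaml shipped with the StateFun deployment.
# Sketch of the 2.x-era layout; type and endpoint are placeholders.
version: "1.0"
module:
  meta:
    type: remote
  spec:
    functions:
      - function:
          meta:
            kind: http
            type: example/greeter          # remote function type
          spec:
            endpoint: http://functions:8000/statefun
            states:
              - seen_count
```

It is exactly this static file that the upcoming release, per Igal's reply above, makes replaceable at runtime.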