Hi,
We encountered a similar issue where the TaskManager kept being killed by
YARN.
- Flink 1.9.1
- Heap usage is low.
But our job is a **streaming** job, so I want to ask whether this issue is only
related to **batch** jobs or not. Thanks!
Best,
Victor
yingjie wrote on Thu, Nov 28, 2019 at 11:43 AM:
> Piotr is
Hi Lu,
You can identify which operator thread causes the high CPU usage, and set a
unique slot sharing group name [1] for it to prevent too many operator threads
from running in the same TM.
Hope this will be helpful 😊
[1].
https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/operators/
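In case a concrete snippet helps, here is a minimal sketch (Java DataStream API); the heavy map body and the group name "cpu-heavy" are just placeholders for illustration:
```java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SlotSharingGroupSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromElements(1, 2, 3, 4, 5)
           // placeholder for the CPU-heavy operator identified in the thread dump
           .map(new MapFunction<Integer, Double>() {
               @Override
               public Double map(Integer x) {
                   double d = 0;
                   for (int i = 0; i < 1_000_000; i++) {
                       d += Math.sqrt(x + i);
                   }
                   return d;
               }
           })
           // put the hot operator in its own slot sharing group so it is not
           // co-located with the other operator threads in the same slot/TM
           .slotSharingGroup("cpu-heavy")
           .print();

        env.execute("slot sharing group sketch");
    }
}
```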
Hi,
“uid” is mainly useful when you upgrade your application. It’s used to match
the operator state stored in the savepoint.
As suggested in [1], “it is highly recommended to assign unique IDs to all
operators of an application that might be upgraded in the future.”
[1].
https://ci.apache
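For illustration, a minimal sketch of assigning an explicit ID (Java DataStream API; the operator and the ID "uppercase-map" are made up):
```java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class UidSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromElements("a", "b", "c")
           .map(new MapFunction<String, String>() {
               @Override
               public String map(String value) {
                   return value.toUpperCase();
               }
           })
           // stable ID: when restoring from a savepoint, the stored operator
           // state is matched by this ID rather than an auto-generated one,
           // so the job can be changed and upgraded without losing the mapping
           .uid("uppercase-map")
           .print();

        env.execute("uid sketch");
    }
}
```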
Hi Vishwas,
Since `DruidStreamJob` is a Scala `object` and the initialization of your
sds client is not inside any method, it runs every time `DruidStreamJob` is
loaded (like a static block in Java).
Your TaskManagers are different JVM processes, and `DruidStreamJob` needs to
be
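Roughly, the Java analogue is a static initializer, which runs once per JVM, i.e. once in each TaskManager process; a sketch with made-up names:
```java
import java.lang.management.ManagementFactory;

// Java analogue of a Scala object whose field is initialized outside any method:
// the static initializer runs when the class is loaded, once per JVM/classloader,
// i.e. once in *every* TaskManager process that loads the class.
public class DruidStreamJobAnalogue {

    // placeholder for the sds client created at object-initialization time
    static final Object SDS_CLIENT = createClient();

    private static Object createClient() {
        System.out.println("initializing client in JVM "
                + ManagementFactory.getRuntimeMXBean().getName());
        return new Object();
    }
}
```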
Hi Biao,
Thanks for your reply, I will give it a try (1.8+)!
Best,
Victor
From: Biao Liu
Date: Friday, August 9, 2019 at 5:45 PM
To: Victor Wong
Cc: "user@flink.apache.org"
Subject: Re: RegistrationTimeoutException after TaskExecutor successfully
registered at resource m
Hi,
I’m using Flink version 1.7.1, and I encountered an exception which seemed a
little weird to me:
the TaskManager successfully registered at the resource manager, but after 5
minutes (the default value of the `taskmanager.registration.timeout` config)
it threw out RegistrationTi
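For reference, the timeout in question is configurable; a sketch of the flink-conf.yaml entry (raising it only delays the symptom and is not necessarily the right fix):
```
# flink-conf.yaml -- default is 5 min
taskmanager.registration.timeout: 10 min
```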
Hi Shakir,
> Delayed Processing
Maybe you can make use of
`org.apache.flink.streaming.api.TimerService#registerProcessingTimeTimer`;
check this doc for more details:
https://ci.apache.org/projects/flink/flink-docs-stable/dev/stream/operators/process_function.html#example
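As a concrete illustration, a minimal sketch of a processing-time timer used to delay emission (class name, state name, and the 60 s delay are made up):
```java
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

// Buffers the latest element per key and emits it ~60 seconds of processing
// time later, using TimerService#registerProcessingTimeTimer.
public class DelayedEmitFunction extends KeyedProcessFunction<String, String, String> {

    private transient ValueState<String> buffered;

    @Override
    public void open(Configuration parameters) {
        buffered = getRuntimeContext().getState(
                new ValueStateDescriptor<>("buffered", String.class));
    }

    @Override
    public void processElement(String value, Context ctx, Collector<String> out) throws Exception {
        buffered.update(value);
        // fire a processing-time timer 60 seconds from now
        ctx.timerService().registerProcessingTimeTimer(
                ctx.timerService().currentProcessingTime() + 60_000L);
    }

    @Override
    public void onTimer(long timestamp, OnTimerContext ctx, Collector<String> out) throws Exception {
        String value = buffered.value();
        if (value != null) {
            out.collect(value);
            buffered.clear();
        }
    }
}
```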
> Rate Lim
Hi Andres,
I’d like to share my thoughts:
When you register a “Table”, you need to specify its “schema”, so how can you
register the table when the number of elements/columns and the data types are
both nondeterministic?
Correct me if I misunderstood your meaning.
Best,
Victor
From: Andres
Hi Yuta,
> I made sure the 'ValueState data' has data from stream1 with the IDE
debugger but in spite of that, processElement2 can't access it.
Since `processElement1` and `processElement2` run in the same operator and
share the same keyed state, I think there is no state access issue.
Is it possible stream1 and stream2 don't
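To illustrate what I mean, a minimal sketch (Java; class and state names are made up): both methods see the same keyed state, but only for elements that map to the same key.
```java
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.co.CoProcessFunction;
import org.apache.flink.util.Collector;

// Used on two connected keyed streams:
// stream1.keyBy(...).connect(stream2.keyBy(...)).process(new JoinWithState())
public class JoinWithState extends CoProcessFunction<String, String, String> {

    private transient ValueState<String> data;

    @Override
    public void open(Configuration parameters) {
        data = getRuntimeContext().getState(
                new ValueStateDescriptor<>("data", String.class));
    }

    @Override
    public void processElement1(String value, Context ctx, Collector<String> out) throws Exception {
        // written under the current key of stream1
        data.update(value);
    }

    @Override
    public void processElement2(String value, Context ctx, Collector<String> out) throws Exception {
        // visible only if an element with the *same key* was already seen on stream1
        String fromStream1 = data.value();
        out.collect(value + " / " + (fromStream1 == null ? "<no state yet>" : fromStream1));
    }
}
```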