Hi Till,
Thank you for the reply.
1. The batch processing may be customized according to the usage scenario.
For our online batch jobs, we set the interval parameter to 8h.
2. For our usage scenario, we need the client to exit immediately when the
number of failed Containers reaches MAXIMUM_WORKERS_FAILURE_RATE.
Hi Till,
1) From Anyang's request, I think it is reasonable to use two parameters
for the rate, since a batch job runs for a while and the failure rate
within a small interval is meaningless on its own.
I think they need a failure count from the beginning of the job as the
failure condition.
@Anyang Hu
2) In the current implementation
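The two-parameter failure condition discussed in this thread (a failure rate within a sliding interval, plus a cumulative failure count from job start) could be sketched roughly as below. This is an illustrative sketch only, not Flink's actual implementation; all names (FailureTracker, record_failure, should_fail_job) are hypothetical.

```python
# Illustrative sketch only -- not Flink's implementation. All names
# (FailureTracker, record_failure, should_fail_job) are hypothetical.
from collections import deque

class FailureTracker:
    """Tracks Container failures against two conditions:
    a rate within a sliding time interval, and a cumulative count."""

    def __init__(self, interval_s, max_rate_failures, max_total_failures):
        self.interval_s = interval_s          # e.g. 8h = 28800s, as in the thread
        self.max_rate_failures = max_rate_failures
        self.max_total_failures = max_total_failures
        self.total = 0                        # failures since job start
        self.recent = deque()                 # failure timestamps in the window

    def record_failure(self, now_s):
        self.total += 1
        self.recent.append(now_s)
        # Drop failures that have aged out of the interval.
        while self.recent and now_s - self.recent[0] > self.interval_s:
            self.recent.popleft()

    def should_fail_job(self):
        # Exit immediately if either threshold is hit.
        return (len(self.recent) >= self.max_rate_failures
                or self.total >= self.max_total_failures)
```

With a cumulative threshold in place, the client can exit even when individual failures are spread too thinly across the interval to trip the rate check.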
Hi Peter,
For our online batch tasks, there is a scenario where the number of failed
Containers reaches MAXIMUM_WORKERS_FAILURE_RATE but the client does not
immediately exit (the probability of JM loss increases greatly when
thousands of Containers are to be started). It is found that the JM disconnection (the
Hi guys,
In Flink 1.9, is there a way to read a local JSON file in Flink SQL, the
way a local CSV file can be read?
We can currently read a local CSV file with the DDL below, but simply
replacing 'csv' with 'json' does not work:
create table source (
  first varchar,
  id int
) with (
  'connector.type' = 'filesystem',
  'connector.p
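For reference, a complete version of a CSV DDL like the truncated one above might look as follows. This is an untested sketch assuming Flink 1.9's legacy descriptor-style properties ('connector.path', 'format.type', 'format.fields.#.name'/'format.fields.#.type'); the file path is a placeholder.

```sql
-- Hypothetical sketch for Flink 1.9: the filesystem connector pairs with
-- the old CSV format, whose schema is repeated in format.fields.* properties.
CREATE TABLE source (
  first VARCHAR,
  id INT
) WITH (
  'connector.type' = 'filesystem',
  'connector.path' = 'file:///tmp/input.csv',  -- placeholder path
  'format.type' = 'csv',
  'format.fields.0.name' = 'first',
  'format.fields.0.type' = 'VARCHAR',
  'format.fields.1.name' = 'id',
  'format.fields.1.type' = 'INT'
);
```

To my knowledge, the 1.9 filesystem table connector only shipped with this CSV format, which would explain why swapping in 'json' fails; reading JSON in 1.9 typically required a different connector or a custom table source.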
On 2019/9/8 5:40 PM, Anyang Hu wrote:
In flink1.9, is there a way to read local json file in Flink SQL like
the reading of csv file?
hi,
might this thread help you?
http://mail-archives.apache.org/mod_mbox/flink-dev/201604.mbox/%3cCAK+0a_o5=c1_p3sylrhtznqbhplexpb7jg_oq-sptre2neo...@mail.gmail.
Hi Wesley,
This is not the way I want, I want to read local json data in Flink SQL by
defining DDL.
Best regards,
Anyang
Wesley Peng wrote on Sunday, September 8, 2019 at 6:14 PM:
> On 2019/9/8 5:40 PM, Anyang Hu wrote:
> > In flink1.9, is there a way to read local json file in Flink SQL like
> > the reading of csv file?
Dear Community,
happy to share this "week's" community update, back after a three-week
summer break. It has been a very busy time in the Flink community, as a lot
of FLIP discussions and votes for Apache Flink 1.10 are underway. I will
try to cover a good part of it in this update along with bugs