Re: Flink 1.9.2: why do checkpoints always expire?

2020-04-27 Thread qq
Hi Jiayi Liao, thanks for your reply. I have added the attachment, and I still can't find any useful messages. > On 2020-04-27 at 12:40, Jiayi Liao wrote: > > <pasted-graphic-1.tiff>

Flink 1.9.2: why do checkpoints always expire?

2020-04-26 Thread qq
Hi all, why do my Flink checkpoints always expire? I use RocksDB for checkpoints, and I can't get any useful messages about the failures. Could you help me? Thanks very much.
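
A minimal tuning sketch, assuming a Flink 1.9 streaming job with the RocksDB state backend; the checkpoint path, intervals, and the placeholder pipeline are hypothetical. Raising the checkpoint timeout and enabling incremental checkpoints are common first steps when checkpoints on large state expire:

import org.apache.flink.contrib.streaming.state.RocksDBStateBackend;
import org.apache.flink.streaming.api.environment.CheckpointConfig;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CheckpointTuningSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Incremental checkpoints upload only new/changed SST files, which usually
        // shortens checkpoint duration for large RocksDB state. Path is hypothetical.
        env.setStateBackend(new RocksDBStateBackend("hdfs:///flink/checkpoints", true));

        env.enableCheckpointing(60_000); // trigger a checkpoint every 60 s
        CheckpointConfig conf = env.getCheckpointConfig();
        conf.setCheckpointTimeout(10 * 60_000);     // allow 10 minutes before a checkpoint expires
        conf.setMinPauseBetweenCheckpoints(30_000); // let the job catch up between checkpoints
        conf.setMaxConcurrentCheckpoints(1);

        // Placeholder pipeline so the sketch compiles and runs on its own.
        env.fromElements("a", "b", "c").print();
        env.execute("checkpoint-tuning-sketch");
    }
}

If checkpoints still expire after that, backpressure on slow operators and the per-operator durations shown in the checkpoint UI are the next places to look.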

Connecting to the RocksDB instance created by a Flink checkpoint

2019-12-30 Thread qq
Hi all, how can I connect to the RocksDB instance created by a Flink checkpoint? I want to check the RocksDB configuration and the data stored in RocksDB. Thanks very much. AlexFu
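
One possible way to inspect such a database, assuming you have the checkpoint's RocksDB directory (the one containing CURRENT and the *.sst files) available on local disk: open it read-only with the RocksDB Java API. The path and the byte-length printing are placeholders; keys and values are Flink-serialized bytes, so they are not human-readable without the job's serializers.

import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import org.rocksdb.ColumnFamilyDescriptor;
import org.rocksdb.ColumnFamilyHandle;
import org.rocksdb.ColumnFamilyOptions;
import org.rocksdb.DBOptions;
import org.rocksdb.Options;
import org.rocksdb.RocksDB;
import org.rocksdb.RocksDBException;
import org.rocksdb.RocksIterator;

public class InspectCheckpointRocksDb {
    public static void main(String[] args) throws RocksDBException {
        RocksDB.loadLibrary();
        // Hypothetical path to the RocksDB directory of a local checkpoint copy.
        String dbPath = "/tmp/flink-checkpoint/db";

        // Flink creates one column family per registered state, plus "default".
        List<byte[]> cfNames = RocksDB.listColumnFamilies(new Options(), dbPath);
        List<ColumnFamilyDescriptor> descriptors = new ArrayList<>();
        for (byte[] name : cfNames) {
            descriptors.add(new ColumnFamilyDescriptor(name, new ColumnFamilyOptions()));
        }

        List<ColumnFamilyHandle> handles = new ArrayList<>();
        try (DBOptions options = new DBOptions();
             RocksDB db = RocksDB.openReadOnly(options, dbPath, descriptors, handles)) {
            for (int i = 0; i < handles.size(); i++) {
                System.out.println("column family: " + new String(cfNames.get(i), StandardCharsets.UTF_8));
                try (RocksIterator it = db.newIterator(handles.get(i))) {
                    for (it.seekToFirst(); it.isValid(); it.next()) {
                        // Keys and values are Flink-serialized, so only sizes are shown.
                        System.out.println("  key " + it.key().length + " bytes, value "
                                + it.value().length + " bytes");
                    }
                }
            }
        }
    }
}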

Get consumer group offset

2019-12-24 Thread qq
Hi all, I use Kafka 0.10.0 and Flink 1.9.0. Why can't I see the consumer group that I configured for the Flink Kafka 0.10 consumer? When I consume the same topic with a plain KafkaConsumer (not through Flink), I can get the consumer group metadata. Thanks. Kafka/bin/kafka-run-class kafka.admin.ConsumerGroupCommand --bootstrap-se
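
For background, the Flink Kafka consumer assigns partitions itself instead of joining the group through Kafka's subscribe/rebalance protocol, and when checkpointing is enabled it commits offsets back to Kafka only on completed checkpoints; a group with no subscribe-based members can therefore show up differently (or not at all) in consumer-group tooling. A configuration sketch, with hypothetical broker address, group id, and topic:

import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;

public class KafkaGroupOffsetSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Offsets are committed back to Kafka only when a checkpoint completes,
        // so the group shows no progress unless checkpointing is enabled.
        env.enableCheckpointing(60_000);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker:9092"); // hypothetical broker
        props.setProperty("group.id", "my-flink-group");       // hypothetical group id

        FlinkKafkaConsumer010<String> consumer =
                new FlinkKafkaConsumer010<>("my-topic", new SimpleStringSchema(), props);
        consumer.setCommitOffsetsOnCheckpoints(true); // the default, made explicit here

        env.addSource(consumer).print();
        env.execute("kafka-group-offset-sketch");
    }
}

With this setup the committed offsets advance once per completed checkpoint rather than continuously.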

How to understand watermark creation for Kafka partitions

2019-12-12 Thread qq
Hi all, I am confused about watermarks for individual Kafka partitions. As far as I know, watermarks are created at the data stream level, so why do people also talk about creating watermarks per Kafka topic partition? In my tests, watermarks were still generated globally even when I ran my job with parallelism. And assign watermarks on
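
A sketch of per-partition watermarking with the Kafka 0.10 connector on Flink 1.9; broker, group id, topic, and the timestamp parsing are hypothetical. The key point is that the assigner is given to the consumer rather than to the resulting DataStream, so each source subtask tracks a watermark per Kafka partition and emits the minimum across the partitions it reads:

import java.util.Properties;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;

public class PerPartitionWatermarks {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker:9092"); // hypothetical broker
        props.setProperty("group.id", "watermark-demo");       // hypothetical group id

        FlinkKafkaConsumer010<String> consumer =
                new FlinkKafkaConsumer010<>("my-topic", new SimpleStringSchema(), props);

        // Assigning the extractor on the consumer (not on the DataStream) enables
        // per-partition watermark tracking inside each Kafka source subtask.
        consumer.assignTimestampsAndWatermarks(
                new BoundedOutOfOrdernessTimestampExtractor<String>(Time.seconds(5)) {
                    @Override
                    public long extractTimestamp(String record) {
                        // Hypothetical: the record itself is an epoch-millisecond timestamp.
                        return Long.parseLong(record.trim());
                    }
                });

        env.addSource(consumer).print();
        env.execute("per-partition-watermarks");
    }
}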

Flink on YARN resource allocation

2019-11-13 Thread qq
Hi all, could you explain in detail how resources are managed for a Flink job on YARN? I ran a Flink job with "-p 20 -yn 5 -ys 3 -yjm 2048m -ytm 2048m" and got: containers 8, vcores 22, Task Managers 7, Total Task Slots 21. I then used the command "-p 20 -yn 7 -ys 4 -yjm 2048m -ytm 2048m" to
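
A plausible reading of the first result, assuming Flink 1.9 on YARN, where -yn is effectively ignored and TaskManagers are started on demand, and assuming the default of one YARN vcore per task slot:

TaskManagers     = ceil(parallelism / slots per TM) = ceil(20 / 3) = 7
Containers       = 7 TaskManagers + 1 JobManager    = 8
Total task slots = 7 * 3                            = 21
VCores           = 7 * 3 (TaskManagers) + 1 (JobManager) = 22

Under the same assumptions, "-p 20 -ys 4" would need ceil(20 / 4) = 5 TaskManagers and 6 containers, regardless of the -yn value.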

Debugging Flink savepoint (checkpoint) recovery in a local dev environment

2019-11-06 Thread qq
Hi all, I want to simulate what the shell command "flink -s <savepoint>" does. This command can only be run from the shell, but I want to debug it in a local development environment. Could anyone help me? Thanks very much. So far I can only use Savepoint.load to read the savepoint metadata and data.
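
A sketch of one way to reproduce the restore locally, assuming Flink 1.9 and using internal (non-public-API) classes MiniCluster and JobGraph; the savepoint path and the toy pipeline are hypothetical stand-ins for the real job topology, which must be rebuilt with the same operator uids:

import org.apache.flink.runtime.jobgraph.JobGraph;
import org.apache.flink.runtime.jobgraph.SavepointRestoreSettings;
import org.apache.flink.runtime.minicluster.MiniCluster;
import org.apache.flink.runtime.minicluster.MiniClusterConfiguration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class LocalSavepointRestore {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Build the same topology as the production job, keeping the same operator uids
        // so that state in the savepoint can be matched back to the operators.
        env.fromElements(1, 2, 3).uid("source-uid").print();

        JobGraph jobGraph = env.getStreamGraph().getJobGraph();
        // Point the job at the savepoint, as "flink run -s <path>" would do.
        jobGraph.setSavepointRestoreSettings(
                SavepointRestoreSettings.forPath("file:///tmp/savepoint-abc", true)); // hypothetical path

        MiniClusterConfiguration config = new MiniClusterConfiguration.Builder()
                .setNumTaskManagers(1)
                .setNumSlotsPerTaskManager(4)
                .build();
        try (MiniCluster miniCluster = new MiniCluster(config)) {
            miniCluster.start();
            // Runs the restored job inside the JVM, so IDE breakpoints in the
            // restore path can be hit while it executes.
            miniCluster.executeJobBlocking(jobGraph);
        }
    }
}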

Dynamically adding sinks to Flink

2017-06-09 Thread qq
Hi, we use Flink as a router for our Kafka data: we read from one Kafka cluster and write to many different Kafka clusters and topics. But it costs too much time to restart the Flink job whenever we want to add another Kafka sink, so is there any way to dynamically add a sink to Flink, or just start the Flink j
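
One partial workaround that avoids restarting for new topics, assuming the targets are topics within a single Kafka cluster: let one producer choose the target topic per record through KeyedSerializationSchema.getTargetTopic. The RoutedMessage type, broker address, and fallback topic below are hypothetical.

import java.nio.charset.StandardCharsets;
import java.util.Properties;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010;
import org.apache.flink.streaming.util.serialization.KeyedSerializationSchema;

public class DynamicTopicRouting {

    // Hypothetical record type: payload plus the topic it should be routed to.
    public static class RoutedMessage {
        public String targetTopic;
        public String payload;
    }

    public static void attachRoutingSink(DataStream<RoutedMessage> stream) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker:9092"); // hypothetical broker list

        // The schema picks the target topic per record, so new topics can be
        // written without restarting the job (within the same Kafka cluster).
        KeyedSerializationSchema<RoutedMessage> schema = new KeyedSerializationSchema<RoutedMessage>() {
            @Override
            public byte[] serializeKey(RoutedMessage element) {
                return null; // records are written without a key
            }

            @Override
            public byte[] serializeValue(RoutedMessage element) {
                return element.payload.getBytes(StandardCharsets.UTF_8);
            }

            @Override
            public String getTargetTopic(RoutedMessage element) {
                return element.targetTopic; // per-record routing decision
            }
        };

        // The constructor topic is only a default; getTargetTopic overrides it per record.
        stream.addSink(new FlinkKafkaProducer010<>("fallback-topic", schema, props));
    }
}

Writing to a different Kafka cluster still needs its own producer sink, so truly dynamic sinks across clusters are not covered by this sketch.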