I am using Flink CDC to read data from MySQL and insert it into Doris. After roughly 500 rows have been inserted, writing stops.
I tried writing the same data to MySQL with the same approach and it arrived correctly, so Flink itself appears to be ruled out.
Any help would be appreciated, thank you!

doris table:
CREATE TABLE `ods_business_order` (
  `id` bigint(20) NOT NULL COMMENT 'primary key',
    ...
)
UNIQUE KEY(`id`)
DISTRIBUTED BY HASH(`id`) BUCKETS 10
PROPERTIES("replication_num" = "1");


flinksql sink table:
CREATE TABLE `doris_business_order` (
  `id` bigint,
    ...
) WITH (
      'connector' = 'doris',
      'fenodes' = 'XXX',
      'table.identifier' = 'stage_order.ods_business_order',
      'username' = 'XXX',
      'password' = 'XXX',
      'sink.batch.size' = '1'
);
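
For comparison, the test that wrote the same stream back into MySQL used "the same method"; the exact sink definition was not included in the post, so the sketch below is an assumption using the standard Flink JDBC connector (connection details and the target table name are placeholders):

flinksql jdbc sink table (sketch, assumed):
CREATE TABLE `mysql_business_order_sink` (
  `id` bigint,
    ...
) WITH (
  -- hypothetical target; the original comparison table was not shown
  'connector' = 'jdbc',
  'url' = 'jdbc:mysql://XXX:3306/XXX',
  'table-name' = 'business_order_check',
  'username' = 'XXX',
  'password' = 'XXX'
);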


flinksql source table:
CREATE TABLE `mysql_business_order` (
  `id` bigint,
    ...
) WITH (
  'connector' = 'mysql-cdc',
  'hostname' = 'XXX',
  'port' = '3306',
  'username' = 'XXX',
  'password' = 'XXX',
  'database-name' = 'mcs_order100000054',
  'table-name' = 'business_order2021[0-9]*'
);
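
The job itself is just an INSERT from the CDC source table into the Doris sink table; the exact statement was not included in the post, so this is a minimal sketch assuming a straight column-for-column copy:

flinksql insert (sketch, assumed):
-- assumes the column lists of the source and sink tables match one to one
INSERT INTO doris_business_order
SELECT * FROM mysql_business_order;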



