Re: Flink 1.10 es sink exception

2020-02-16 Thread Leonard Xu
Hi, sunfulin Using a constant key in a `group by` query is unusual and inefficient; for now you can get around this bug by bubbling the constant key up out of the `group by`. BTW, godfrey is ready to resolve the issue. > On Feb 17, 2020, at 10:15, sunfulin wrote: > > Hi, > WOW, really thankful for the track and
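The workaround, as a minimal sketch (the constant 'ALL' and the expoFlag/clkFlag source fields are placeholders, not the original query): keep the constant only in the select list, so the planner can derive the upsert key from the remaining `group by` columns.

  -- Workaround sketch: the constant key stays out of GROUP BY,
  -- so the unique key can be derived from (pageId, ts).
  INSERT INTO ES6_ZHANGLE_OUTPUT
  SELECT
    'ALL' AS aggId,              -- placeholder constant key, projection only
    pageId,
    ts,
    SUM(expoFlag) AS expoCnt,    -- expoFlag/clkFlag are hypothetical source fields
    SUM(clkFlag)  AS clkCnt
  FROM kafka_zl_etrack_event_stream
  GROUP BY pageId, ts;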

Re:Re: Flink 1.10 es sink exception

2020-02-16 Thread sunfulin
Hi, WOW, really thankful for the tracking and debugging of this problem. I can see the constant key issue. Appreciate your kind help :) At 2020-02-15 21:06:58, "Leonard Xu" wrote: Hi, sunfulin I reproduced your case; this should be a bug in extracting the unique key from the plan, and I create

Re: Flink 1.10 es sink exception

2020-02-15 Thread Leonard Xu
Hi, sunfulin I reproduced your case; this should be a bug in extracting the unique key from the plan, and I created an issue[1] to track this. CC: jark [1] https://issues.apache.org/jira/browse/FLINK-16070 > On Feb 14, 2020, at 23:39, sunfulin wrote: > > Hi,

Re:Re: Flink 1.10 es sink exception

2020-02-14 Thread sunfulin
Hi, Jark Thanks for your reply. An insert with a column list is indeed not allowed with the old planner enabled in Flink 1.10; it throws an exception such as "Partial insert is not supported". Never mind that issue. Focusing on the UpsertMode exception, my es DDL is like the following: CR
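Since the DDL is cut off in this preview, here is a minimal sketch of what a Flink 1.10 Elasticsearch 6 sink DDL in upsert mode typically looks like (the host, index, and field types are placeholders, not sunfulin's actual definition):

  CREATE TABLE ES6_ZHANGLE_OUTPUT (
    aggId VARCHAR,
    pageId VARCHAR,
    ts VARCHAR,
    expoCnt BIGINT,
    clkCnt BIGINT
  ) WITH (
    'connector.type' = 'elasticsearch',
    'connector.version' = '6',
    'connector.hosts' = 'http://localhost:9200',   -- placeholder host
    'connector.index' = 'zhangle_output',          -- placeholder index
    'connector.document-type' = '_doc',
    'update-mode' = 'upsert',
    'format.type' = 'json'
  );

Note that in Flink 1.10 the upsert key is not declared in the DDL; it is inferred from the query, which is why the unique-key extraction discussed in this thread matters.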

Re: Flink 1.10 es sink exception

2020-02-14 Thread Jark Wu
Hi sunfulin, Is this the real query you submitted? AFAIK, an insert with a column list is not allowed for now, i.e. the `INSERT INTO ES6_ZHANGLE_OUTPUT(aggId, pageId, ts, expoCnt, clkCnt)`. Could you attach the full SQL text, including the DDLs of the ES6_ZHANGLE_OUTPUT table and the kafka_zl_etrack_event_stream t
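For reference, a short sketch of the two insert forms (the view name is a placeholder): the column-list form is what this thread says is not yet supported in 1.10, so the portable shape is the second one, with the select columns lining up with the sink schema by position.

  -- Form this thread says is not accepted (explicit column list):
  -- INSERT INTO ES6_ZHANGLE_OUTPUT (aggId, pageId, ts, expoCnt, clkCnt) SELECT ...

  -- Portable form: no column list, select in the sink's declared column order
  INSERT INTO ES6_ZHANGLE_OUTPUT
  SELECT aggId, pageId, ts, expoCnt, clkCnt
  FROM aggregated_view;   -- hypothetical intermediate view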

Flink 1.10 es sink exception

2020-02-13 Thread sunfulin
Hi, guys When running a Flink SQL job like the following, I met an exception like "org.apache.flink.table.api.TableException: UpsertStreamTableSink requires that Table has a full primary keys if it is updated". I am using the latest Flink 1.10 release with the blink planner enabled. Because the same
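A sketch of the query shape behind this thread (all names other than the table names quoted elsewhere in the thread are placeholders): when a column that is effectively a constant ends up in `group by`, the planner fails to extract a full unique key for the updating result, and the upsert sink rejects the plan with the exception above.

  -- Triggers the UpsertStreamTableSink error in Flink 1.10 (see FLINK-16070):
  -- aggId is a constant, and grouping on it breaks unique-key extraction.
  INSERT INTO ES6_ZHANGLE_OUTPUT
  SELECT aggId, pageId, ts, SUM(expoFlag) AS expoCnt, SUM(clkFlag) AS clkCnt
  FROM (
    SELECT 'ALL' AS aggId, pageId, ts, expoFlag, clkFlag   -- 'ALL' is a placeholder constant
    FROM kafka_zl_etrack_event_stream
  ) t
  GROUP BY aggId, pageId, ts;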