>> at org.apache.calcite.sql2rel.StandardConvertletTable.convertExpressionList(StandardConvertletTable.java:968) ~[flink-table_2.11-1.3.1.jar:1.3.1]
>> at org.apache.calcite.sql2rel.StandardConvertletTable.convertCall(StandardConvertletTable.java:944) ~[flink-table_2.11-1.3.1.jar:1.3.1]
>> at org.apache.calcite.sql2rel.StandardConvertletTable.convertCall(StandardConvertletTable.java:928) ~[flink-table_2.11-1.3.1.jar:1.3.1]
>> ... 50 common frames omitted
>
> Is there anyone who can tell me how to deal with it? Thanks!
>
> -- Original Message --
>
-- Original Message --
From: "Nico Kruber"
Date: Tuesday, 25 July 2017, 11:48
To: "user"
Subject: Re:
Please, for the sake of making your email searchable, do not post stack traces
as screenshots; paste them as text into your email instead.
On Tuesday, 25 July 2017 12:18:56 CEST 程骥 wrote:
> My SQL is like this (it contains a Chinese word).
>
> I get an exception when I submit the job to the cluster.
>
> Is there anyone who can tell me how to deal with it? Thanks!
Hi,
currently Flink does not support this charset in a LIKE expression. This
is due to a limitation in the Apache Calcite library. Maybe you can open
an issue there.
The easiest solution for this is to implement your own scalar function
that does a `string.contains(...)` check for the Chinese word.
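The suggested workaround could look roughly like this (a minimal sketch; the class name `ContainsWord` is my own, and it assumes flink-table 1.3.x on the classpath):

```java
import org.apache.flink.table.functions.ScalarFunction;

// Hypothetical UDF replacing LIKE '%word%' with a plain substring check,
// which sidesteps Calcite's charset validation of the LIKE pattern.
public class ContainsWord extends ScalarFunction {
    public boolean eval(String str, String word) {
        // java.lang.String.contains works on any Unicode content,
        // including Chinese characters.
        return str != null && word != null && str.contains(word);
    }
}
```

You would register it once, e.g. `tableEnv.registerFunction("containsWord", new ContainsWord())`, and then rewrite `WHERE msg LIKE '%中文%'` as `WHERE containsWord(msg, '中文')` in the SQL query.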
My SQL is like this (it contains a Chinese word).
I get an exception when I submit the job to the cluster.
Is there anyone who can tell me how to deal with it? Thanks!