> ...name and description, such as RowType and StructuredType.
>
> Thanks,
> Kaka Chen
>
> Caizhi Weng wrote on Tue, Jul 16, 2019 at 11:16 PM:
>
> > Hi Kaka and Jark,
> >
> > On a side note, `RowTypeInfo` only compares field types in its `equals`
> > method. I think
Hi Caizhi and Jark,
I think you are correct. From a quick look at the source code, it should
only compare field types in the equals method.
Currently, some composite logical row types, such as RowType and
StructuredType, also compare name and description.
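To make the point about equals concrete, here is a minimal Java sketch (not from this thread; the field names and types are made up for illustration): two RowTypeInfo instances that differ only in their field names still compare equal, because equals only looks at the field types.

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.typeutils.RowTypeInfo;

public class RowTypeInfoEqualsSketch {
    public static void main(String[] args) {
        // Two row types with identical field types but different field names.
        RowTypeInfo a = new RowTypeInfo(
                new TypeInformation<?>[] {Types.INT}, new String[] {"f0"});
        RowTypeInfo b = new RowTypeInfo(
                new TypeInformation<?>[] {Types.INT}, new String[] {"EXPR$0"});

        // Prints "true": equals() compares only the field types, not the field names.
        System.out.println(a.equals(b));
    }
}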
Thanks,
Kaka Chen
Caizhi Weng wrote on Jul 16, 2019:
Hi Jark,
Thanks!
Thanks,
Kaka Chen
Jark Wu wrote on Tue, Jul 16, 2019 at 10:30 PM:
> Hi Kaka,
>
> Thanks for reporting this. We didn't cover integration tests for connectors
> yet because of FLINK-13276. We will cover that after FLINK-13276 is fixed.
>
> The problem you raised m
... not match requested" +
      s" type. Requested: $requestedTypeInfo; Actual: $fieldTypeInfo")
  }
}
...
Thanks,
Frank
kaka chen wrote on Tue, Jul 16, 2019 at 5:23 PM:
> Hi All,
>
>
> We are trying to switch to the blink table planner in the HBase connector, which
...
Integer); Actual: Row(EXPR$0: Integer)
    at org.apache.flink.addons.hbase.HBaseSinkITCase.testTableSink(HBaseSinkITCase.java:140)
The original Flink table planner executes this test successfully.
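As an aside (not part of the original report): EXPR$0 is the name the planner generates for an unaliased expression, so a mismatch like the one above can appear whenever the query does not alias its select expressions to the field names the sink requests. A minimal sketch follows; the source table, field names, and setup are made up for illustration.

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.java.StreamTableEnvironment;

public class ExprNameSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(
                env, EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build());

        // Made-up in-memory source with a single INT column named "id".
        Table src = tEnv.fromDataStream(env.fromElements(1, 2, 3), "id");
        tEnv.registerTable("src", src);

        // Unaliased expression: the planner generates the field name EXPR$0.
        Table unaliased = tEnv.sqlQuery("SELECT id + 1 FROM src");
        System.out.println(unaliased.getSchema());

        // Aliased expression: the field name matches what a sink would request (f0 here).
        Table aliased = tEnv.sqlQuery("SELECT id + 1 AS f0 FROM src");
        System.out.println(aliased.getSchema());
    }
}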
Thanks,
Kaka Chen
Hi Jingsong:
Thanks very much.
Thanks,
Kaka Chen
JingsongLee wrote on Tue, Jul 16, 2019 at 5:03 PM:
> Hi kaka:
> Yeah, there are still some problems. I created a JIRA[1] to track it; I
> think we should solve it before the 1.9 release.
>
> [1] https://issues.apache.org/jira/browse/FLINK
.
Thanks,
Kaka Chen
Danny Chan wrote on Tue, Jul 16, 2019 at 4:53 PM:
> How do you use the Kafka connector: through TableEnvironment#sqlUpdate
> (the DDL), or through TableEnvironment#connect() (the Table API)?
>
> Best,
> Danny Chan
> On Jul 16, 2019 at 4:41 PM +0800, kaka chen wrote:
> >
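Editor's aside on the question above: a sketch of the two registration paths Danny mentions, written against the 1.9-era Java APIs. The topic name, broker address, field names, and registered table names are made up; this is an illustration of the two paths, not the connector code discussed in this thread.

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.table.descriptors.Json;
import org.apache.flink.table.descriptors.Kafka;
import org.apache.flink.table.descriptors.Schema;

public class KafkaRegistrationSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(
                env, EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build());

        // Path 1: the Table API connector descriptors (TableEnvironment#connect).
        tEnv.connect(
                new Kafka()
                        .version("universal")
                        .topic("my_topic")                                // made-up topic
                        .property("bootstrap.servers", "localhost:9092")) // made-up address
                .withFormat(new Json())
                .withSchema(new Schema()
                        .field("user_id", Types.LONG)
                        .field("message", Types.STRING))
                .inAppendMode()
                .registerTableSource("kafka_source_via_connect");

        // Path 2: SQL DDL (TableEnvironment#sqlUpdate), e.g.
        //   tEnv.sqlUpdate("CREATE TABLE kafka_source_via_ddl (...) WITH ('connector.type' = 'kafka', ...)");
        // with the remaining connector/format properties elided here.
    }
}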
Hi All,
We are trying to switch to the blink table planner in the Kafka connector, but
found there is no SchemaValidator or related classes. How can we resolve this?
Thanks.
Thanks,
Kaka Chen
Hi Yun,
Thank you for your clarification.
Thanks,
Kaka Chen
Yun Tang wrote on Fri, Jul 5, 2019 at 3:34 PM:
> Hi kaka
>
> You're correct, these comments are no longer accurate for RocksDBMapState;
> I will correct them with a hotfix.
>
> Best
> Yun Tang
Hi All,
I noticed RocksDBListState and RocksDBMapState both have the following
comment:
 * {@link RocksDBStateBackend} must ensure that we set the
 * {@link org.rocksdb.StringAppendOperator} on the column family that we use
 * for our state since we use the {@code merge()} call.
However, from the
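For reference, here is a minimal standalone RocksDB (Java) sketch of what that comment describes, outside of Flink: the column family is configured with StringAppendOperator so that merge() appends to the stored value. The database path, column family name, keys, and values are made up.

import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import org.rocksdb.ColumnFamilyDescriptor;
import org.rocksdb.ColumnFamilyHandle;
import org.rocksdb.ColumnFamilyOptions;
import org.rocksdb.DBOptions;
import org.rocksdb.RocksDB;
import org.rocksdb.StringAppendOperator;

public class MergeOperatorSketch {
    public static void main(String[] args) throws Exception {
        RocksDB.loadLibrary();

        // The column family must be configured with a merge operator,
        // otherwise merge() calls on it will fail.
        ColumnFamilyOptions cfOptions =
                new ColumnFamilyOptions().setMergeOperator(new StringAppendOperator());

        List<ColumnFamilyDescriptor> descriptors = Arrays.asList(
                new ColumnFamilyDescriptor(RocksDB.DEFAULT_COLUMN_FAMILY, cfOptions),
                new ColumnFamilyDescriptor("state".getBytes(StandardCharsets.UTF_8), cfOptions));
        List<ColumnFamilyHandle> handles = new ArrayList<>();

        try (DBOptions dbOptions = new DBOptions()
                     .setCreateIfMissing(true)
                     .setCreateMissingColumnFamilies(true);
             RocksDB db = RocksDB.open(dbOptions, "/tmp/merge-demo", descriptors, handles)) {

            ColumnFamilyHandle stateCf = handles.get(1);
            byte[] key = "list-key".getBytes(StandardCharsets.UTF_8);

            // merge() appends the new value to the existing one using the
            // StringAppendOperator (',' delimiter by default).
            db.merge(stateCf, key, "a".getBytes(StandardCharsets.UTF_8));
            db.merge(stateCf, key, "b".getBytes(StandardCharsets.UTF_8));

            System.out.println(new String(db.get(stateCf, key), StandardCharsets.UTF_8)); // a,b

            for (ColumnFamilyHandle handle : handles) {
                handle.close();
            }
        }
    }
}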
> ...will remove blink's own TableEnvironment and use the API's
> TableEnvironment, and your problem will be fixed.
> Ideally, we can finish this at the end of this week.
>
> Best,
> Jark
>
> On Thu, 4 Jul 2019 at 19:20, kaka chen wrote:
>
> > Hi All:
> >
> > We found
Is this normal? Could someone help? Thanks.
Thanks,
Kaka Chen