Best,
Danny Chan
-- Forwarded message --
From: Danny Chan
Date: Jul 20, 2020, 4:51 PM (+0800)
To: Dongwon Kim
Subject: Re: [Table API] how to configure a nested timestamp field
> Or is it possible to pre-define a catalog there and register it through the
> SQL CLI YAML?
>
> Best,
> Danny
Hi, Kim

> Hi Leonard,
>
> Can I have a YAML definition corresponding to the DDL you suggested?

Unfortunately the answer is no: the YAML you defined is parsed by the Table
API and then executed, and the root cause of the error you posted is that the
Table API does not support computed columns yet. There is a FLIP under
discussion[1]; this should be ready in 1.12.0. BTW, I think DDL is recommended.
Hi Leonard,
Can I have a YAML definition corresponding to the DDL you suggested?
I tried the definition below (Flink 1.11.0) but got an error:
> tables:
>   - name: test
>     type: source-table
>     update-mode: append
>     connector:
>       property-version: 1
>       type: kafka
>       version: univer
Hi Leonard,
Wow, that's great! It works like a charm.
I'd never considered this approach at all.
Thanks a lot.
Best,
Dongwon
On Mon, Jul 6, 2020 at 11:26 AM Leonard Xu wrote:

> Hi, Kim
>
> The reason your attempts (2) and (3) failed is that the json format does
> not support converting a BIGINT to TIMESTAMP. You can first define the
> BIGINT field and then use a computed column to extract a TIMESTAMP field;
> you can also define the time attribute on the TIMESTAMP field for using
> time-based operations.
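Leonard's suggestion can be sketched as a DDL along these lines. This is only
a sketch: the table name, topic, and connector options are assumptions (shown
in Flink 1.11's option style), and `TO_TIMESTAMP(FROM_UNIXTIME(...))` is the
usual idiom for turning an epoch-millisecond BIGINT into a `TIMESTAMP(3)`:

```sql
-- Sketch only: names and connector options are assumed, not from the thread.
CREATE TABLE test (
  `type` STRING,
  -- the raw nested field is declared as BIGINT, matching the JSON payload
  `location` ROW<`id` STRING, `lastUpdateTime` BIGINT>,
  -- computed column: epoch milliseconds -> TIMESTAMP(3)
  update_time AS TO_TIMESTAMP(FROM_UNIXTIME(`location`.`lastUpdateTime` / 1000)),
  -- declare the time attribute on the computed column for time-based operations
  WATERMARK FOR update_time AS update_time - INTERVAL '5' SECOND
) WITH (
  'connector' = 'kafka',                          -- assumed options
  'topic' = 'test',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json'
);
```

This is the pattern the thread converges on: since the json format cannot cast
BIGINT to TIMESTAMP directly, the conversion happens in a computed column,
which (per Danny's reply above) is why DDL rather than YAML is needed here.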
Hi,

I use Flink 1.10.1 and I want to use the Table API to read JSON messages. The
messages look like the one below.
> {
>   "type": "Update",
>   "location": {
>     "id": "123e4567-e89b-12d3-a456-42665234",
>     "lastUpdateTime": 1593866161436
>   }
> }
I wrote the following