Hi Jark,

I saw this mail, and it looks similar to an issue I raised to the community 
several days ago. [1] Could you take a look and see whether it is the same issue?

If so, there is a further question. On the Pravega connector side, the issue 
shows up in our Batch Table API, i.e. when users create tables through the 
BatchTableEnvironment. Currently, BatchTableEnvironment does not support the 
Blink planner. Any suggestions on how we can support batch tables in Flink 1.10?

[1] 
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Need-help-on-timestamp-type-conversion-for-Table-API-on-Pravega-Connector-td33660.html
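
For context, the only way I know of to run the Blink planner in batch mode in 
1.10 is through the unified TableEnvironment rather than BatchTableEnvironment. 
A minimal sketch (assuming the connector can register its tables via DDL or 
descriptors and does not rely on the DataSet API; the wrapper class name is 
just a placeholder):

```
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class BlinkBatchSketch {
    public static void main(String[] args) {
        // Blink planner batch mode is only reachable via the unified TableEnvironment;
        // there is no DataSet-based BatchTableEnvironment for the Blink planner.
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inBatchMode()
                .build();
        TableEnvironment tEnv = TableEnvironment.create(settings);

        // Tables would then be registered via DDL or connect() descriptors
        // and queried with tEnv.sqlQuery(...).
    }
}
```

The trade-off is that this path does not go through BatchTableEnvironment, 
which is what our users currently use to create tables.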

Best Regards,
Brian

From: Jark Wu <imj...@gmail.com>
Sent: Thursday, March 19, 2020 17:14
To: Paul Lam
Cc: user
Subject: Re: SQL Timestamp types incompatible after migration to 1.10


Hi Paul,

Are you using the old planner? Did you try the Blink planner? I guess it may be 
a bug in the old planner, which doesn't work well with the new types.
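
In case it helps, switching planners only changes how the TableEnvironment is 
created; the table registration and the INSERT statement stay the same. A rough 
sketch for a streaming job (the wrapper class name is just a placeholder):

```
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.java.StreamTableEnvironment;

public class BlinkPlannerSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Select the Blink planner instead of the legacy (old) planner.
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inStreamingMode()
                .build();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env, settings);

        // Register the Kafka sink table here, then run the same statement, e.g.:
        // tEnv.sqlUpdate("INSERT INTO ... SELECT ...");
        // tEnv.execute("insert job");
    }
}
```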

Best,
Jark

On Thu, 19 Mar 2020 at 16:27, Paul Lam <paullin3...@gmail.com> wrote:
Hi,

I recently upgraded a simple application that inserts static data into a table 
from Flink 1.9.0 to 1.10.0, and encountered a timestamp type incompatibility 
problem during table sink validation.

The SQL is like:
```
-- sink schema: (user_name STRING, user_id INT, login_time TIMESTAMP)
insert into kafka.test.tbl_a
select 'ann', 1000, TIMESTAMP '2019-12-30 00:00:00'
```

And the error thrown:
```
Field types of query result and registered TableSink `kafka`.`test`.`tbl_a` do not match.
Query result schema: [EXPR$0: String, EXPR$1: Integer, EXPR$2: Timestamp]
TableSink schema: [user_name: String, user_id: Integer, login_time: LocalDateTime]
```

After some digging, I found the root cause might be that, since FLINK-14645, 
timestamp fields defined via TableFactory are bridged to java.time.LocalDateTime, 
while timestamp literals are still backed by java.sql.Timestamp.

Is my reasoning correct? And is there any workaround? Thanks a lot!
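
One workaround I considered but haven't verified (and it only applies if the 
sink's TableSchema is built programmatically rather than derived purely from 
properties) is bridging the sink's timestamp column back to java.sql.Timestamp 
so that validation sees matching physical types. A sketch (precision 3 and the 
class name are assumptions of mine):

```
import java.sql.Timestamp;

import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.TableSchema;

public class BridgedSinkSchemaSketch {
    public static void main(String[] args) {
        // Declare the timestamp column bridged to java.sql.Timestamp, matching the
        // representation the TIMESTAMP literal currently produces in the old planner.
        TableSchema schema = TableSchema.builder()
                .field("user_name", DataTypes.STRING())
                .field("user_id", DataTypes.INT())
                .field("login_time", DataTypes.TIMESTAMP(3).bridgedTo(Timestamp.class))
                .build();

        System.out.println(schema);
    }
}
```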

Best,
Paul Lam
