[ https://issues.apache.org/jira/browse/FLINK-13699?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
ASF GitHub Bot updated FLINK-13699:
-----------------------------------
    Labels: pull-request-available  (was: )

> Fix TableFactory doesn't work with DDL when containing TIMESTAMP/DATE/TIME types
> --------------------------------------------------------------------------------
>
>                 Key: FLINK-13699
>                 URL: https://issues.apache.org/jira/browse/FLINK-13699
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / API, Table SQL / Planner
>    Affects Versions: 1.9.0
>            Reporter: Jark Wu
>            Assignee: Jark Wu
>            Priority: Critical
>              Labels: pull-request-available
>
> Currently, the Blink planner converts DDL to a {{TableSchema}} using the new
> type system, i.e. DataTypes.TIMESTAMP()/DATE()/TIME(), whose underlying
> TypeInformation is Types.LOCAL_DATETIME/LOCAL_DATE/LOCAL_TIME.
> However, this breaks the existing connector implementations (Kafka, ES, CSV,
> etc.), because they only accept the old TypeInformation
> (Types.SQL_TIMESTAMP/SQL_DATE/SQL_TIME).
> A simple solution is to encode DataTypes.TIMESTAMP() as "TIMESTAMP" when
> translating to properties; on the way back it would be converted to the old
> TypeInformation, Types.SQL_TIMESTAMP. This would fix all factories at once.

--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
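The proposed fix amounts to a round trip through the type string written into the connector properties: the planner encodes the new DataTypes by their SQL names, and the factories decode those names back to the legacy TypeInformation they already understand. A minimal sketch of that idea, with illustrative string placeholders standing in for Flink's actual DataType and TypeInformation classes (class and method names below are hypothetical, not Flink APIs):

```java
import java.util.Map;

// Sketch of the proposed property round trip: encode the new types by their
// SQL names when writing connector properties, and map those names back to
// the legacy TypeInformation names the existing factories (Kafka, ES, CSV,
// etc.) accept. All names are illustrative placeholders, not Flink classes.
public class TypeStringRoundTrip {

    // New-type declaration -> type string written to the DDL properties.
    static String encode(String dataType) {
        switch (dataType) {
            case "DataTypes.TIMESTAMP()": return "TIMESTAMP";
            case "DataTypes.DATE()":      return "DATE";
            case "DataTypes.TIME()":      return "TIME";
            default: throw new IllegalArgumentException("unsupported: " + dataType);
        }
    }

    // Type string in properties -> legacy TypeInformation the factories accept.
    static String decode(String property) {
        switch (property) {
            case "TIMESTAMP": return "Types.SQL_TIMESTAMP";
            case "DATE":      return "Types.SQL_DATE";
            case "TIME":      return "Types.SQL_TIME";
            default: throw new IllegalArgumentException("unsupported: " + property);
        }
    }

    public static void main(String[] args) {
        // Round trip: the planner writes "TIMESTAMP"; on the read path the
        // factory receives the old Types.SQL_TIMESTAMP, so no per-connector
        // change is needed.
        for (String t : new String[] {
                "DataTypes.TIMESTAMP()", "DataTypes.DATE()", "DataTypes.TIME()"}) {
            String prop = encode(t);
            System.out.println(t + " -> \"" + prop + "\" -> " + decode(prop));
        }
    }
}
```

Because every factory reads types through the same property deserialization path, changing only that encode/decode step fixes all of them at once, which matches the issue's stated intent.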