CREATE TABLE user_log(
    `id` INT,
    `timestamp`  BIGINT
) WITH (
    'connector.type' = 'kafka',
    'connector.version' = 'universal',
    'connector.topic' = 'wanglei_jsontest',
    'connector.startup-mode' = 'latest-offset',
    'connector.properties.0.key' = 'zookeeper.connect',
    'connector.properties.0.value' = '172.19.78.32:2181',
    'connector.properties.1.key' = 'bootstrap.servers',
    'connector.properties.1.value' = '172.19.78.32:9092',
    'update-mode' = 'append',
    'format.type' = 'json',
    'format.json-schema' = '{
        "type": "object",
        "properties": {
           "id": {"type": "integer"},
           "timestamp": {"type": "number"}
        }
    }'
);

Then I run select * from user_log; and get the following error:

org.apache.flink.table.api.ValidationException: Type INT of table field 'id' 
does not match with the physical type LEGACY('DECIMAL', 'DECIMAL') of the 'id' 
field of the TableSource return type.

It seems the JSON schema types "integer" and "number" cannot be mapped to INT and BIGINT.

How can I solve this problem?
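One workaround I am considering (not confirmed yet; this is only a sketch assuming the JSON format's 'format.derive-schema' option applies to my setup) is to drop the explicit 'format.json-schema' and let the format derive the schema from the table columns:

CREATE TABLE user_log(
    `id` INT,
    `timestamp` BIGINT
) WITH (
    'connector.type' = 'kafka',
    'connector.version' = 'universal',
    'connector.topic' = 'wanglei_jsontest',
    'connector.startup-mode' = 'latest-offset',
    'connector.properties.0.key' = 'zookeeper.connect',
    'connector.properties.0.value' = '172.19.78.32:2181',
    'connector.properties.1.key' = 'bootstrap.servers',
    'connector.properties.1.value' = '172.19.78.32:9092',
    'update-mode' = 'append',
    'format.type' = 'json',
    -- derive the JSON schema from the table schema
    -- instead of declaring 'format.json-schema' explicitly
    'format.derive-schema' = 'true'
);

Alternatively, I suppose I could declare the columns as DECIMAL so that they match the physical type reported in the exception, but I would prefer to keep INT and BIGINT if possible.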

Thanks,
Lei



wangl...@geekplus.com.cn 
