Hi Shengnan,

Yes, Flink 1.9 supports deriving the schema from nested JSON. You should
declare the ROW type with the nested schema explicitly. I tested a similar
DDL against the 1.9.0 RC2 and it worked well.

CREATE TABLE kafka_json_source (
    rowtime VARCHAR,
    user_name VARCHAR,
    event ROW<message_type VARCHAR, message VARCHAR>
) WITH (
    'connector.type' = 'kafka',
    'connector.version' = 'universal',
    'connector.topic' = 'test-json',
    'connector.startup-mode' = 'earliest-offset',
    'connector.properties.0.key' = 'zookeeper.connect',
    'connector.properties.0.value' = 'localhost:2181',
    'connector.properties.1.key' = 'bootstrap.servers',
    'connector.properties.1.value' = 'localhost:9092',
    'update-mode' = 'append',
    'format.type' = 'json',
    'format.derive-schema' = 'true'
);
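
For what it's worth, once the table is declared this way the nested fields
can be referenced with dot notation in queries; a minimal sketch (table and
field names taken from the DDL above):

```sql
-- Access the nested ROW fields with dot notation
SELECT user_name, event.message_type, event.message
FROM kafka_json_source
WHERE event.message_type = 'WARNING';
```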

The Kafka message is:

{"rowtime": "2018-03-12T08:00:00Z", "user_name": "Alice", "event": {
"message_type": "WARNING", "message": "This is a warning."}}
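
As a quick sanity check (plain Python, not Flink code): the nesting of that
payload lines up field-by-field with the declared schema, which ordinary
JSON parsing confirms:

```python
import json

# The sample Kafka payload from above
msg = ('{"rowtime": "2018-03-12T08:00:00Z", "user_name": "Alice", '
       '"event": {"message_type": "WARNING", "message": "This is a warning."}}')

parsed = json.loads(msg)

# Top-level keys map to the VARCHAR columns ...
print(parsed["user_name"])              # Alice
# ... and "event" maps to the ROW<message_type VARCHAR, message VARCHAR> column
print(parsed["event"]["message_type"])  # WARNING
print(parsed["event"]["message"])       # This is a warning.
```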


Thanks,
Jark


On Thu, 15 Aug 2019 at 14:12, Shengnan YU <ysna...@hotmail.com> wrote:

>
> Hi guys,
> I am trying the DDL feature in branch 1.9-release. I am stuck on
> creating a table from Kafka with nested JSON format. Is it possible to
> specify a "Row" type of column to derive the nested JSON schema?
>
> String sql = "create table kafka_stream(\n" +
>                 "  a varchar, \n" +
>                 "  b varchar,\n" +
>                 "  c int,\n" +
>                 "  inner_json row\n" +
>                 ") with (\n" +
>                 "  'connector.type' ='kafka',\n" +
>                 "  'connector.version' = '0.11',\n" +
>                 "  'update-mode' = 'append', \n" +
>                 "  'connector.topic' = 'test',\n" +
>                 "  'connector.properties.0.key' = 'bootstrap.servers',\n" +
>                 "  'connector.properties.0.value' = 'localhost:9092',\n" +
>                 "  'format.type' = 'json', \n" +
>                 "  'format.derive-schema' = 'true'\n" +
>                 ")\n";
>
>  Thank you very much!
>
