Hi, Shengnan YU ~

You can look at the test cases in FlinkDDLDataTypeTest[1] for a quick
reference of what a DDL column type looks like.

[1] 
https://github.com/apache/flink/blob/a194b37d9b99a47174de9108a937f821816d61f5/flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkDDLDataTypeTest.java#L165
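For concreteness, a nested JSON object is declared as a ROW type with named, typed fields. Below is a sketch of what the statement from your mail could look like with that change; the field names inside the ROW (d, e) are made up for illustration, everything else mirrors your original statement:

```java
// Sketch: the nested JSON column is declared as ROW<...> with explicit
// field names and types, so the JSON format can derive the nested schema.
public class NestedRowDdlExample {

    static String buildDdl() {
        return "CREATE TABLE kafka_stream (\n"
            + "  a VARCHAR,\n"
            + "  b VARCHAR,\n"
            + "  c INT,\n"
            // Illustrative nested fields d and e; replace with the actual
            // keys of your inner JSON object.
            + "  inner_json ROW<d VARCHAR, e INT>\n"
            + ") WITH (\n"
            + "  'connector.type' = 'kafka',\n"
            + "  'connector.version' = '0.11',\n"
            + "  'update-mode' = 'append',\n"
            + "  'connector.topic' = 'test',\n"
            + "  'connector.properties.0.key' = 'bootstrap.servers',\n"
            + "  'connector.properties.0.value' = 'localhost:9092',\n"
            + "  'format.type' = 'json',\n"
            + "  'format.derive-schema' = 'true'\n"
            + ")";
    }

    public static void main(String[] args) {
        System.out.println(buildDdl());
    }
}
```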

Best,
Danny Chan
On Aug 15, 2019 at 2:12 PM +0800, Shengnan YU <ysna...@hotmail.com> wrote:
>
> Hi guys
> I am trying the DDL feature in branch 1.9-release. I am stuck on creating
> a table from Kafka with nested JSON format. Is it possible to specify a "Row"
> type of column to derive the nested JSON schema?
>
> String sql = "create table kafka_stream(\n" +
> " a varchar, \n" +
> " b varchar,\n" +
> " c int,\n" +
> " inner_json row\n" +
> ") with (\n" +
> " 'connector.type' ='kafka',\n" +
> " 'connector.version' = '0.11',\n" +
> " 'update-mode' = 'append', \n" +
> " 'connector.topic' = 'test',\n" +
> " 'connector.properties.0.key' = 'bootstrap.servers',\n" +
> " 'connector.properties.0.value' = 'localhost:9092',\n" +
> " 'format.type' = 'json', \n" +
> " 'format.derive-schema' = 'true'\n" +
> ")\n";
>
> Thank you very much!
