There is still no data at all, and I can't tell what's wrong. The table already exists in Hive. I tested writing to HDFS via DDL and that works fine.





------------------ Original Message ------------------
From: JasonLee <[email protected]>
Sent: 21 July 2020 20:39
To: user-zh <[email protected]>
Subject: Re: flink-1.11 DDL kafka-to-hive issue



hi
Does the Hive table never get any data, or does data show up after a while?


JasonLee
Email: [email protected]

On 2020-07-21 19:09, kcz wrote:
hive-1.2.1
Checkpoints are completing successfully (I looked in the checkpoint directory and the data is there, and Kafka has data too), but the Hive table has no data. Am I missing something?
String hiveSql = "CREATE TABLE stream_tmp.fs_table (\n" +
        "  host STRING,\n" +
        "  url STRING\n" +
        // In Hive DDL the partition column must not also appear in the column list
        ") PARTITIONED BY (public_date STRING) " +
        "STORED AS PARQUET " +
        "TBLPROPERTIES (\n" +
        "  'sink.partition-commit.delay'='0s',\n" +
        "  'sink.partition-commit.trigger'='partition-time',\n" +
        "  'sink.partition-commit.policy.kind'='metastore,success-file'\n" +
        ")";
tableEnv.executeSql(hiveSql);


tableEnv.executeSql("INSERT INTO stream_tmp.fs_table SELECT host, url, " +
        "DATE_FORMAT(public_date, 'yyyy-MM-dd') FROM stream_tmp.source_table");
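(Editor's note, as a hedged pointer for readers with the same symptom: with 'sink.partition-commit.trigger'='partition-time', Flink only commits a partition once the watermark passes the partition's time, so the Kafka source table must declare a watermark; without one, partitions are never committed and the Hive table appears empty even though checkpoints succeed. A minimal sketch of such a source DDL follows. The source_table definition is not shown in the thread, so the topic, connector options, and the assumption that public_date is an event-time TIMESTAMP column are all hypothetical.)

    -- Sketch only: declares a watermark so the 'partition-time' commit
    -- trigger can fire. Topic and connector options are hypothetical.
    CREATE TABLE stream_tmp.source_table (
      host STRING,
      url STRING,
      public_date TIMESTAMP(3),
      WATERMARK FOR public_date AS public_date - INTERVAL '5' SECOND
    ) WITH (
      'connector' = 'kafka',
      'topic' = 'example_topic',
      'properties.bootstrap.servers' = 'localhost:9092',
      'format' = 'json',
      'scan.startup.mode' = 'latest-offset'
    );

Alternatively, if the source has no usable event time, switching to 'sink.partition-commit.trigger'='process-time' avoids the watermark requirement.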
