YUZHOU HONG created FLINK-10982:
-----------------------------------

             Summary: Test DataStream to Table for Flink 1.6.2
                 Key: FLINK-10982
                 URL: https://issues.apache.org/jira/browse/FLINK-10982
             Project: Flink
          Issue Type: Test
          Components: Table API & SQL
    Affects Versions: 1.6.2
         Environment: jdk1.8, flink1.6, macOS 10.13
            Reporter: YUZHOU HONG
I am a newcomer to the Flink Table API & SQL. While following the official documentation, I tested a demo that converts two DataStreams into Tables and then combines them with unionAll. The program fails with the exception "Exception in thread "main" java.lang.NoSuchFieldError: DOT".

By the way, the part of the official documentation I was reading is [https://ci.apache.org/projects/flink/flink-docs-release-1.6/dev/table/common.html].

Here are my code and the exception stack trace.

{code:java}
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class StreamToSql {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = TableEnvironment.getTableEnvironment(env);
        env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
        env.setParallelism(1);

        DataStream<Tuple2<Integer, String>> stream1 = env.fromElements(new Tuple2<>(1, "hello"));
        DataStream<Tuple2<Integer, String>> stream2 = env.fromElements(new Tuple2<>(1, "hello"));
        // stream1.print();

        Table table1 = tEnv.fromDataStream(stream1, "count, word");
        Table table2 = tEnv.fromDataStream(stream2, "count, word");

        Table table = table1
                .where("LIKE(word, 'F%')")
                .unionAll(table2);

        DataStream<Row> res = tEnv.toAppendStream(table, Row.class);
        res.print();

        env.execute("StreamToSql");
    }
}
{code}

{code:java}
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/yizhou/.m2/repository/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/yizhou/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/yizhou/.m2/repository/ch/qos/logback/logback-classic/1.1.3/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchFieldError: DOT
	at org.apache.flink.table.validate.BasicOperatorTable.<init>(FunctionCatalog.scala:316)
	at org.apache.flink.table.validate.FunctionCatalog.getSqlOperatorTable(FunctionCatalog.scala:58)
	at org.apache.flink.table.api.TableEnvironment.getSqlOperatorTable(TableEnvironment.scala:129)
	at org.apache.flink.table.api.TableEnvironment.frameworkConfig$lzycompute(TableEnvironment.scala:92)
	at org.apache.flink.table.api.TableEnvironment.frameworkConfig(TableEnvironment.scala:86)
	at org.apache.flink.table.api.TableEnvironment.relBuilder$lzycompute(TableEnvironment.scala:98)
	at org.apache.flink.table.api.TableEnvironment.relBuilder(TableEnvironment.scala:98)
	at org.apache.flink.table.api.TableEnvironment.typeFactory$lzycompute(TableEnvironment.scala:103)
	at org.apache.flink.table.api.TableEnvironment.typeFactory(TableEnvironment.scala:103)
	at org.apache.flink.table.api.TableEnvironment.scanInternal(TableEnvironment.scala:564)
	at org.apache.flink.table.api.TableEnvironment.scan(TableEnvironment.scala:519)
	at org.apache.flink.table.api.java.StreamTableEnvironment.fromDataStream(StreamTableEnvironment.scala:89)
	at realtime.stream.StreamToSql.main(StreamToSql.java:25)

Process finished with exit code 1
{code}
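My unverified suspicion: the error is thrown from BasicOperatorTable, which presumably references Calcite's SqlStdOperatorTable.DOT, so an older calcite-core on the classpath (one that predates the DOT field) shadowing the Calcite classes bundled in flink-table could explain it. A minimal sketch to check which jar actually supplies the Calcite operator table (the class name WhichCalcite is only illustrative):

{code:java}
// Hypothetical diagnostic sketch, separate from the failing job: print which jar
// the Calcite standard operator table class was loaded from. With a clean Flink
// 1.6.2 setup this is expected to point at the flink-table jar (which bundles
// Calcite classes); a separate, older calcite-core jar here would be a likely
// source of the missing DOT field.
import org.apache.calcite.sql.fun.SqlStdOperatorTable;

public class WhichCalcite {
    public static void main(String[] args) {
        System.out.println(SqlStdOperatorTable.class
                .getProtectionDomain()
                .getCodeSource()
                .getLocation());
    }
}
{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)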