[ https://issues.apache.org/jira/browse/FLINK-6196?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15978698#comment-15978698 ]
ASF GitHub Bot commented on FLINK-6196:
---------------------------------------

Github user fhueske commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3623#discussion_r112680743

    --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/runtime/dataset/DataSetUserDefinedFunctionITCase.scala ---
    @@ -117,6 +117,187 @@ class DataSetUserDefinedFunctionITCase(
       }

       @Test
    +  def testDynamicSchema(): Unit = {
    --- End diff --

    Also, these are many integration tests, each of which runs a full job. Isn't `TableTestBase` sufficient to check whether the plan is translated correctly? IMO, we should have only a few integration tests and more unit tests based on `TableTestBase`. What do you think?

> Support dynamic schema in Table Function
> ----------------------------------------
>
>                 Key: FLINK-6196
>                 URL: https://issues.apache.org/jira/browse/FLINK-6196
>             Project: Flink
>          Issue Type: Improvement
>          Components: Table API & SQL
>            Reporter: Zhuoluo Yang
>            Assignee: Zhuoluo Yang
>
> In many of our use cases, we have to decide the schema of a UDTF at runtime. For example, udtf('c1, c2, c3') will generate three columns for a lateral view.
> Most systems, such as Calcite and Hive, support this feature. However, the current implementation in Flink does not implement this feature correctly.
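
To make the requested behavior concrete, below is a minimal sketch of a UDTF whose row schema is derived from a column list, written against the Flink 1.x `TableFunction[Row]` API. The class name `DynamicSchemaUdtf`, the comma-separated column-list constructor argument, and the '#'-separated input format are invented for illustration. Note that in this sketch the schema is fixed at construction/registration time; deriving it from the literal SQL call arguments (udtf('c1, c2, c3')) is the extension this issue asks for.

```scala
import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
import org.apache.flink.api.java.typeutils.RowTypeInfo
import org.apache.flink.table.functions.TableFunction
import org.apache.flink.types.Row

// Hypothetical UDTF: `new DynamicSchemaUdtf("c1,c2,c3")` emits rows with
// three String columns named c1, c2 and c3.
class DynamicSchemaUdtf(columnNames: String) extends TableFunction[Row] {

  private val fieldNames: Array[String] = columnNames.split(",").map(_.trim)

  // Split the input on '#' and emit one row whose arity matches the column list.
  def eval(input: String): Unit = {
    val values = input.split("#")
    val row = new Row(fieldNames.length)
    fieldNames.indices.foreach { i =>
      row.setField(i, if (i < values.length) values(i) else null)
    }
    collect(row)
  }

  // Declare the row type explicitly, since it cannot be extracted from Row.
  override def getResultType: TypeInformation[Row] =
    new RowTypeInfo(
      fieldNames.map(_ => BasicTypeInfo.STRING_TYPE_INFO: TypeInformation[_]),
      fieldNames)
}
```

Once registered with the table environment (e.g. via registerFunction), such a function can be used in a lateral join; the open question in this issue is how to let the planner pick up a schema that depends on the call-site arguments rather than on the constructor.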
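
Regarding fhueske's suggestion above: a `TableTestBase`-based unit test asserts on the translated plan instead of executing a full job. The sketch below is modelled on the existing correlate plan tests in flink-table; the helper locations (`TableTestBase`, `TableTestUtil`, `TableFunc1`), the `addFunction` signature, and the exact operator names and terms in the expected plan are assumptions that would need to be aligned with the actual test utilities and optimizer output.

```scala
import org.apache.flink.api.scala._
import org.apache.flink.table.api.scala._
import org.apache.flink.table.utils.TableTestUtil._
import org.apache.flink.table.utils.{TableFunc1, TableTestBase}
import org.junit.Test

class DynamicSchemaPlanTest extends TableTestBase {

  @Test
  def testUdtfIsTranslatedToCorrelate(): Unit = {
    val util = batchTestUtil()
    val table = util.addTable[(Int, Long, String)]("MyTable", 'a, 'b, 'c)
    val function = util.addFunction("func1", new TableFunc1)

    val result = table.join(function('c) as 's).select('c, 's)

    // Assert on the translated plan rather than running a job. The operator
    // names and terms below are placeholders that must match the plan the
    // optimizer actually produces.
    val expected = unaryNode(
      "DataSetCalc",
      unaryNode(
        "DataSetCorrelate",
        batchTableNode(0),
        term("invocation", "func1($cor0.c)"),
        term("function", function),
        term("rowType",
          "RecordType(INTEGER a, BIGINT b, VARCHAR(65536) c, VARCHAR(65536) s)"),
        term("joinType", "INNER")
      ),
      term("select", "c", "s")
    )

    util.verifyTable(result, expected)
  }
}
```

A handful of ITCases can still cover end-to-end execution, while plan-level checks like this keep the bulk of the test suite fast.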