[ https://issues.apache.org/jira/browse/FLINK-5571?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15853765#comment-15853765 ]

ASF GitHub Bot commented on FLINK-5571:
---------------------------------------

Github user godfreyhe commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3176#discussion_r99548048
  
    --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/api/scala/batch/sql/UserDefinedTableFunctionITCase.scala ---
    @@ -0,0 +1,106 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one
    + * or more contributor license agreements.  See the NOTICE file
    + * distributed with this work for additional information
    + * regarding copyright ownership.  The ASF licenses this file
    + * to you under the Apache License, Version 2.0 (the
    + * "License"); you may not use this file except in compliance
    + * with the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.flink.table.api.scala.batch.sql
    +
    +import org.apache.flink.api.scala.util.CollectionDataSets
    +import org.apache.flink.api.scala.{ExecutionEnvironment, _}
    +import org.apache.flink.table.api.TableEnvironment
    +import org.apache.flink.table.api.scala._
    +import org.apache.flink.table.api.scala.batch.utils.UDFTestUtils
    +import org.apache.flink.table.utils.{RichTableFunc0, RichTableFunc1}
    +import org.apache.flink.test.util.TestBaseUtils
    +import org.apache.flink.types.Row
    +import org.junit.Test
    +
    +import scala.collection.JavaConverters._
    +
    +class UserDefinedTableFunctionITCase {
    +
    +  @Test
    +  def testOpenClose(): Unit = {
    +    val env = ExecutionEnvironment.getExecutionEnvironment
    +    val tEnv = TableEnvironment.getTableEnvironment(env)
    +    tEnv.registerFunction("RichTableFunc0", new RichTableFunc0)
    +
    +    val sqlQuery = "SELECT a, s FROM t1, LATERAL TABLE(RichTableFunc0(c)) as T(s)"
    +
    +    val ds = CollectionDataSets.get3TupleDataSet(env)
    +    tEnv.registerDataSet("t1", ds, 'a, 'b, 'c)
    +
    +    val result = tEnv.sql(sqlQuery)
    +
    +    val expected =
    +      "1,Hi\n2,Hello\n3,Hello world\n4,Hello world, how are you?\n5,I am 
fine.\n6,Luke Skywalker"
    +    val results = result.toDataSet[Row].collect()
    +    TestBaseUtils.compareResultAsText(results.asJava, expected)
    +  }
    +
    +  @Test
    +  def testSingleUDTFWithParameter(): Unit = {
    +    val env = ExecutionEnvironment.getExecutionEnvironment
    +    val tEnv = TableEnvironment.getTableEnvironment(env)
    +    tEnv.registerFunction("RichTableFunc1", new RichTableFunc1)
    +    UDFTestUtils.setJobParameters(env, Map("word_separator" -> " "))
    +
    +    val sqlQuery = "SELECT a, s FROM t1, LATERAL TABLE(RichTableFunc1(c)) as T(s)"
    +
    +    val ds = CollectionDataSets.getSmall3TupleDataSet(env)
    +    tEnv.registerDataSet("t1", ds, 'a, 'b, 'c)
    +
    +    val result = tEnv.sql(sqlQuery)
    +
    +    val expected = "3,Hello\n3,world"
    +    val results = result.toDataSet[Row].collect()
    +    TestBaseUtils.compareResultAsText(results.asJava, expected)
    +  }
    +
    +  @Test
    +  def testMultiUDTFs(): Unit = {
    --- End diff --
    
    yes, each `RichTableFunction` will generate an independent FlatMap function, and I think this test is also meaningful. I will add cases to test a UDTF combined with a UDF later.
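    
    For example, such a combined UDTF + UDF case could look roughly like the sketch below. The `UpperCase` scalar function is hypothetical and used only for illustration (it is not part of this PR); everything else follows the existing test setup in this diff:
    
    {code}
    // Hypothetical scalar UDF, defined only to sketch a UDTF + UDF case.
    class UpperCase extends ScalarFunction {
      def eval(s: String): String = s.toUpperCase
    }
    
    @Test
    def testUDTFWithScalarUDF(): Unit = {
      val env = ExecutionEnvironment.getExecutionEnvironment
      val tEnv = TableEnvironment.getTableEnvironment(env)
      tEnv.registerFunction("RichTableFunc1", new RichTableFunc1)
      tEnv.registerFunction("UpperCase", new UpperCase)
      UDFTestUtils.setJobParameters(env, Map("word_separator" -> " "))
    
      // the scalar UDF is applied to the column produced by the table function
      val sqlQuery = "SELECT a, UpperCase(s) FROM t1, LATERAL TABLE(RichTableFunc1(c)) as T(s)"
    
      val ds = CollectionDataSets.getSmall3TupleDataSet(env)
      tEnv.registerDataSet("t1", ds, 'a, 'b, 'c)
    
      // expected output assumes the same splitting behavior as testSingleUDTFWithParameter above
      val results = tEnv.sql(sqlQuery).toDataSet[Row].collect()
      TestBaseUtils.compareResultAsText(results.asJava, "3,HELLO\n3,WORLD")
    }
    {code}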


> add open and close methods for UserDefinedFunction in TableAPI & SQL
> --------------------------------------------------------------------
>
>                 Key: FLINK-5571
>                 URL: https://issues.apache.org/jira/browse/FLINK-5571
>             Project: Flink
>          Issue Type: New Feature
>          Components: Table API & SQL
>            Reporter: godfrey he
>            Assignee: godfrey he
>
> Currently, a user-defined function (UDF) in the Table API & SQL works on zero, 
> one, or multiple values in a custom evaluation method. Many UDFs need more 
> advanced features, e.g. reporting metrics, reading parameters from the job 
> configuration, or reading extra data from a distributed cache file. Adding open 
> and close methods to the UserDefinedFunction class can solve this problem. The 
> code could look like:
> {code}
> trait UserDefinedFunction {
>   def open(context: UDFContext): Unit = {}
>   def close(): Unit = {}
> }
> {code}
> UDFContext provides access to metric groups, job parameters, the distributed 
> cache, etc. The code could look like:
> {code}
> class UDFContext(context: RuntimeContext) {
>   def getMetricGroup: MetricGroup = ???
>   def getDistributedCacheFile(name: String): File = ???
>   def getJobParameter(key: String, default: String): String = ???
> }
> {code}
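> A user function could then override these hooks. Below is a minimal sketch, assuming the proposed UDFContext above; the SplitTableFunc class and its word-separator logic are illustrative only:
> {code}
> import org.apache.flink.table.functions.TableFunction
> 
> // Illustrative sketch only: relies on the proposed open/close hooks and the
> // UDFContext described above.
> class SplitTableFunc extends TableFunction[String] {
> 
>   private var separator: String = " "
> 
>   override def open(context: UDFContext): Unit = {
>     // read the separator from the job parameters at runtime
>     separator = context.getJobParameter("word_separator", " ")
>   }
> 
>   def eval(str: String): Unit = {
>     // emit one row per word
>     str.split(separator).foreach(word => collect(word))
>   }
> 
>   override def close(): Unit = {
>     // release any resources acquired in open()
>   }
> }
> {code}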



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
