[ https://issues.apache.org/jira/browse/FLINK-4883?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15653466#comment-15653466 ]
ASF GitHub Bot commented on FLINK-4883:
---------------------------------------

Github user Renkai commented on a diff in the pull request:

    https://github.com/apache/flink/pull/2729#discussion_r87350332

    --- Diff: flink-java/src/main/java/org/apache/flink/api/java/DataSet.java ---
    @@ -181,7 +181,17 @@ protected void fillInType(TypeInformation<T> typeInfo) {
     		return this.type;
     	}
     
    +	/**
    +	 * 1. Checks whether the function is implemented by a Scala singleton object. The check is
    +	 *    performed only if it has not been disabled in the {@link org.apache.flink.api.common.ExecutionConfig}.
    +	 *
    +	 * 2. Returns a "closure-cleaned" version of the given function. Cleaning is performed only
    +	 *    if closure cleaning has not been disabled in the {@link org.apache.flink.api.common.ExecutionConfig}.
    +	 */
     	public <F> F clean(F f) {
    --- End diff --
    
    Thanks for the review. I tried to add a new API to DataSet, but I got a compile error; I think some Maven plugin prevents me from changing the API. Could you tell me how to pass the compilation and API check?


> Prevent UDF implementations through Scala singleton objects
> ------------------------------------------------------------
>
>                 Key: FLINK-4883
>                 URL: https://issues.apache.org/jira/browse/FLINK-4883
>             Project: Flink
>          Issue Type: Bug
>            Reporter: Stefan Richter
>            Assignee: Renkai Ge
>
> Currently, users can create and use UDFs in Scala like this:
> {code}
> object FlatMapper extends RichCoFlatMapFunction[Long, String, (Long, String)] {
>   ...
> }
> {code}
> However, this leads to problems: the UDF is now a singleton that Flink may use across several operator instances, which leads to job failures. We should detect and prevent the usage of singleton UDFs.
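
To illustrate the check discussed above (this is only a sketch, not the code from this pull request): the Scala compiler turns every `object` into a class whose name ends in `$` and that carries a static, self-typed `MODULE$` field holding the single instance, so a guard inside `clean()` could detect singleton UDFs roughly as follows. The helper name `ScalaObjectChecker` and its error message are hypothetical.

{code}
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;

// Illustrative sketch only; not the implementation from this pull request.
public final class ScalaObjectChecker {

	private ScalaObjectChecker() {}

	/**
	 * Returns true if the given function instance is a Scala singleton object.
	 * Scala compiles "object Foo" to a class "Foo$" that exposes a static,
	 * self-typed field named MODULE$ holding the single instance.
	 */
	public static boolean isScalaObject(Object f) {
		Class<?> clazz = f.getClass();
		if (!clazz.getName().endsWith("$")) {
			return false;
		}
		try {
			Field module = clazz.getDeclaredField("MODULE$");
			return Modifier.isStatic(module.getModifiers()) && module.getType() == clazz;
		} catch (NoSuchFieldException e) {
			return false;
		}
	}

	/** Rejects singleton UDFs with a descriptive error (hypothetical message). */
	public static void checkNotScalaObject(Object f) {
		if (isScalaObject(f)) {
			throw new IllegalArgumentException(
				"User-defined functions must not be implemented as Scala singleton objects; "
					+ "use a class or an anonymous function instead.");
		}
	}
}
{code}

A `clean()` method along the lines of the Javadoc in the diff above could call such a check before running the closure cleaner, and skip it when the corresponding option is disabled in the ExecutionConfig.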