[ https://issues.apache.org/jira/browse/FLINK-1525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14537696#comment-14537696 ]
ASF GitHub Bot commented on FLINK-1525:
---------------------------------------

Github user rmetzger commented on the pull request:

https://github.com/apache/flink/pull/664#issuecomment-100805130

In Hadoop, the `UserConfig` would probably be a `Configuration` object with key/value pairs. In Flink, we are trying to get rid of these untyped maps. Instead, I would recommend that users use a simple Java class, like

```java
public static class MyConfig implements UserConfig {
    public long someLongValue;
    public int someInt;

    // this is optional; it exposes the values, e.g. in the web interface
    public Map<String, String> toMap() {
        return null;
    }
}
```

It can be used in a similar way to a Configuration object, but the compiler is able to check the types. The `ParameterUtil` implements the `UserConfig` interface to expose the configuration values through the Flink program and in the web interface.

> Provide utils to pass -D parameters to UDFs
> --------------------------------------------
>
>                 Key: FLINK-1525
>                 URL: https://issues.apache.org/jira/browse/FLINK-1525
>             Project: Flink
>          Issue Type: Improvement
>          Components: flink-contrib
>            Reporter: Robert Metzger
>              Labels: starter
>
> Hadoop users are used to setting job configuration through "-D" on the command line.
> Right now, Flink users have to manually parse command line arguments and pass them to their user-defined functions.
> It would be nice to provide a standard args parser which takes care of such stuff.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
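For illustration, a minimal sketch of how such a typed config class could be handed to a UDF through its constructor, so the values stay type-checked end to end. The `MultiplyMapper` class and the constructor/field wiring are hypothetical, and since `UserConfig` is only introduced by the pull request under review, the sketch implements plain `java.io.Serializable` instead:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.flink.api.common.functions.MapFunction;

public class TypedConfigExample {

    // Typed configuration holder, mirroring the MyConfig snippet in the comment.
    // The compiler checks the field types, unlike an untyped key/value map.
    public static class MyConfig implements java.io.Serializable {
        public final long someLongValue;
        public final int someInt;

        public MyConfig(long someLongValue, int someInt) {
            this.someLongValue = someLongValue;
            this.someInt = someInt;
        }

        // Optional: expose the values as strings, e.g. for display purposes.
        public Map<String, String> toMap() {
            Map<String, String> map = new HashMap<>();
            map.put("someLongValue", Long.toString(someLongValue));
            map.put("someInt", Integer.toString(someInt));
            return map;
        }
    }

    // Hypothetical UDF that receives the typed config through its constructor.
    public static class MultiplyMapper implements MapFunction<Long, Long> {
        private final MyConfig config;

        public MultiplyMapper(MyConfig config) {
            this.config = config;
        }

        @Override
        public Long map(Long value) {
            return value * config.someLongValue + config.someInt;
        }
    }
}
```

A job's main() method could then parse the command line once (for example with the proposed `ParameterUtil`), build a `MyConfig`, and pass it to each function's constructor; type mismatches surface at compile time rather than at runtime, which is the advantage over an untyped Configuration map described above.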