luoyuxia commented on code in PR #25258:
URL: https://github.com/apache/flink/pull/25258#discussion_r1830600593


##########
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java:
##########
@@ -226,6 +243,37 @@ public HiveCatalog(
     }
 
     public static HiveConf createHiveConf(
+            @Nullable String hiveConfDir,
+            @Nullable String hadoopConfDir,
+            @Nullable ReadableConfig flinkConfiguration) {
+        HiveConf hiveconf = initHiveConf(hiveConfDir, hadoopConfDir);
+        // add all configuration key with prefix 'flink.hive.hadoop.' and 'flink.hive.' in flink
+        // conf to hive conf
+        String hivePrefix = FLINK_HIVE_CONFIG_PREFIXES;
+        String hadoopPrefix = "hadoop.";
+        if (flinkConfiguration != null) {
+            for (String key : flinkConfiguration.toMap().keySet()) {
+                if (key.startsWith(hivePrefix)) {
+                    String newKey = key.substring(hivePrefix.length());
+                    if (newKey.startsWith(hadoopPrefix)) {
+                        newKey = newKey.substring(hadoopPrefix.length());
+                    }
+                    String value =
+                            flinkConfiguration.get(
+                                    ConfigOptions.key(key).stringType().noDefaultValue());
+                    hiveconf.set(newKey, value);
+                    LOG.debug(
+                            "Adding Flink config entry for {} as {}={} to Hive config",
+                            key,
+                            newKey,
+                            value);
+                }
+            }
+        }
+        return hiveconf;
+    }
+
+    public static HiveConf initHiveConf(

Review Comment:
   nit: We can still name the method `createHiveConf` 
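
   For illustration (the signature below is assumed from the quoted diff, not taken from the actual PR): since the extracted helper takes a narrower parameter list, it could stay an overload of `createHiveConf` rather than introducing the new `initHiveConf` name.

   ```java
   // Hypothetical sketch: same factory name, fewer parameters, so the two
   // methods coexist as overloads.
   public static HiveConf createHiveConf(
           @Nullable String hiveConfDir, @Nullable String hadoopConfDir) {
       // ... load hive-site.xml / Hadoop resources from the given directories,
       // as the current initHiveConf body does ...
   }
   ```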



##########
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java:
##########
@@ -226,6 +243,37 @@ public HiveCatalog(
     }
 
     public static HiveConf createHiveConf(
+            @Nullable String hiveConfDir,
+            @Nullable String hadoopConfDir,
+            @Nullable ReadableConfig flinkConfiguration) {
+        HiveConf hiveconf = initHiveConf(hiveConfDir, hadoopConfDir);
+        // add all configuration key with prefix 'flink.hive.hadoop.' and 'flink.hive.' in flink
+        // conf to hive conf
+        String hivePrefix = FLINK_HIVE_CONFIG_PREFIXES;
+        String hadoopPrefix = "hadoop.";
+        if (flinkConfiguration != null) {
+            for (String key : flinkConfiguration.toMap().keySet()) {
+                if (key.startsWith(hivePrefix)) {
+                    String newKey = key.substring(hivePrefix.length());
+                    if (newKey.startsWith(hadoopPrefix)) {

Review Comment:
   Why do we need the additional check for `hadoopPrefix`? Can we just put the entry into the Hive conf once the key starts with `flink.hive.`?
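
   For illustration, a minimal sketch of the simplification being asked about, assuming `FLINK_HIVE_CONFIG_PREFIXES` resolves to the single `flink.hive.` prefix used above:

   ```java
   // Sketch: treat every 'flink.hive.'-prefixed key uniformly and write it
   // into the Hive conf directly, without the extra 'hadoop.' special case.
   for (Map.Entry<String, String> entry : flinkConfiguration.toMap().entrySet()) {
       String key = entry.getKey();
       if (key.startsWith(hivePrefix)) {
           hiveconf.set(key.substring(hivePrefix.length()), entry.getValue());
       }
   }
   ```

   Note this would leave the `hadoop.` segment in the resulting key for `flink.hive.hadoop.*` entries; whether Hive/Hadoop accept those keys unstripped is presumably what the question is probing.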



##########
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java:
##########
@@ -226,6 +243,37 @@ public HiveCatalog(
     }
 
     public static HiveConf createHiveConf(
+            @Nullable String hiveConfDir,
+            @Nullable String hadoopConfDir,
+            @Nullable ReadableConfig flinkConfiguration) {
+        HiveConf hiveconf = initHiveConf(hiveConfDir, hadoopConfDir);
+        // add all configuration key with prefix 'flink.hive.hadoop.' and 'flink.hive.' in flink
+        // conf to hive conf
+        String hivePrefix = FLINK_HIVE_CONFIG_PREFIXES;
+        String hadoopPrefix = "hadoop.";
+        if (flinkConfiguration != null) {
+            for (String key : flinkConfiguration.toMap().keySet()) {

Review Comment:
   nit: use `flinkConfiguration.toMap().entrySet()`
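
   For illustration, the same loop iterating over entries (surrounding variables assumed from the quoted diff); `entry.getValue()` already carries the string value, so the per-key `ConfigOptions` lookup is no longer needed:

   ```java
   for (Map.Entry<String, String> entry : flinkConfiguration.toMap().entrySet()) {
       String key = entry.getKey();
       if (key.startsWith(hivePrefix)) {
           String newKey = key.substring(hivePrefix.length());
           if (newKey.startsWith(hadoopPrefix)) {
               newKey = newKey.substring(hadoopPrefix.length());
           }
           // The entry value is already the raw string; no ConfigOptions round trip.
           hiveconf.set(newKey, entry.getValue());
           LOG.debug(
                   "Adding Flink config entry for {} as {}={} to Hive config",
                   key,
                   newKey,
                   entry.getValue());
       }
   }
   ```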



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
