rmahindra123 commented on a change in pull request #3660:
URL: https://github.com/apache/hudi/pull/3660#discussion_r755465220



##########
File path: hudi-kafka-connect/src/main/java/org/apache/hudi/connect/utils/KafkaConnectUtils.java
##########
@@ -85,9 +87,15 @@ public static int getLatestNumPartitions(String bootstrapServers, String topicNa
    *
    * @return
    */
-  public static Configuration getDefaultHadoopConf() {
+  public static Configuration getDefaultHadoopConf(KafkaConnectConfigs connectConfigs) {
     Configuration hadoopConf = new Configuration();
-    hadoopConf.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
+    connectConfigs.getProps().keySet().stream().filter(prop -> {
+      // In order to prevent printing unnecessary warn logs, here filter out the hoodie
+      // configuration items before passing to hadoop/hive configs
+      return !prop.toString().startsWith(HOODIE_CONF_PREFIX);
+    }).forEach(prop -> {
+      hadoopConf.set(prop.toString(), connectConfigs.getProps().get(prop.toString()).toString());
+    });

Review comment:
       I could, but I have to convert the Kafka map configs to Hadoop configs. Also, we can change this logic later if required. For instance, if we want only the Hadoop confs that start with "conf.hadoop." in Kafka Connect, we can do that independently. WDYT?
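For illustration, the prefix-based alternative mentioned above could look roughly like the sketch below. This is a hypothetical, dependency-free version: the prefix constant `HADOOP_CONF_PREFIX`, the helper name `extractHadoopConfs`, and the use of a plain `Map` instead of the real `KafkaConnectConfigs`/Hadoop `Configuration` types are all assumptions, not code from this PR.

```java
import java.util.HashMap;
import java.util.Map;

public class HadoopConfFilter {

  // Hypothetical prefix for Hadoop configs in Kafka Connect properties;
  // the actual constant name/value would be decided in the PR.
  static final String HADOOP_CONF_PREFIX = "conf.hadoop.";

  // Keep only the properties that start with the prefix, stripping the
  // prefix before they would be passed into a Hadoop Configuration.
  static Map<String, String> extractHadoopConfs(Map<String, String> connectProps) {
    Map<String, String> hadoopConfs = new HashMap<>();
    connectProps.forEach((key, value) -> {
      if (key.startsWith(HADOOP_CONF_PREFIX)) {
        hadoopConfs.put(key.substring(HADOOP_CONF_PREFIX.length()), value);
      }
    });
    return hadoopConfs;
  }

  public static void main(String[] args) {
    Map<String, String> props = new HashMap<>();
    props.put("conf.hadoop.fs.defaultFS", "hdfs://localhost:9000");
    props.put("hoodie.table.name", "test_table"); // filtered out: no prefix
    Map<String, String> hadoopConfs = extractHadoopConfs(props);
    System.out.println(hadoopConfs); // prints {fs.defaultFS=hdfs://localhost:9000}
  }
}
```

The upside of this design is that only explicitly opted-in properties reach the Hadoop config, so no filtering of `hoodie.*` keys is needed; the downside is that users must rename their existing properties to carry the prefix.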




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

