gaogao110 opened a new issue, #5623:
URL: https://github.com/apache/hudi/issues/5623

   **Describe the problem you faced**

   When writing to a Hudi table from Flink with Hive sync enabled (JDBC mode), the sync fails with a `NoSuchMethodError` if the Hive table already exists.

   **To Reproduce**

   Steps to reproduce the behavior:

   1. Write to a Hudi table with Hive sync enabled.
   2. If the Hive table already exists, the sync step throws the exception below.
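   For context, a minimal Flink SQL job that exercises this path might look as follows. This is a hypothetical sketch: the table name, storage path, and connection URIs are placeholders, and the Hive sync options follow the Hudi 0.10.x Flink option names (JDBC mode, matching the `JDBCExecutor` in the stacktrace).

   ```sql
   -- Hypothetical reproduction sketch: names, paths, and URIs are placeholders.
   CREATE TABLE t1 (
     uuid VARCHAR(20) PRIMARY KEY NOT ENFORCED,
     name VARCHAR(10),
     ts   TIMESTAMP(3)
   ) WITH (
     'connector' = 'hudi',
     'path' = 'hdfs:///tmp/hudi/t1',
     'table.type' = 'COPY_ON_WRITE',
     -- Hive sync options (Hudi 0.10.x Flink); sync runs after each commit
     'hive_sync.enable' = 'true',
     'hive_sync.mode' = 'jdbc',
     'hive_sync.jdbc_url' = 'jdbc:hive2://localhost:10000',
     'hive_sync.metastore.uris' = 'thrift://localhost:9083',
     'hive_sync.db' = 'default',
     'hive_sync.table' = 't1'
   );

   INSERT INTO t1 VALUES ('id1', 'Alice', TIMESTAMP '2022-04-21 14:02:00');
   -- The first run creates the Hive table; on a later run, HiveSyncTool reads
   -- the existing table schema over JDBC and hits the NoSuchMethodError below.
   ```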
   
   **Environment Description**
   
   * Hudi version : 0.10.1
   
   * Flink version : 1.13.6
   
   * Hive version : 3.1.2
   
   * Hadoop version : 3.1.3
   
   * Running on Docker? (yes/no) : no
   
   **Stacktrace**
   
   ```
   2022-04-21 14:03:12,628 INFO  org.apache.hudi.org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Closed a connection to metastore, current connections: 0
   2022-04-21 14:03:12,628 ERROR org.apache.hudi.sink.StreamWriteOperatorCoordinator          [] - Executor executes action [sync hive metadata for instant 20220421140236703] error
   java.lang.NoSuchMethodError: org.apache.hadoop.hive.serde2.thrift.Type.getType(Lorg/apache/hudi/org/apache/hive/service/rpc/thrift/TTypeId;)Lorg/apache/hadoop/hive/serde2/thrift/Type;
       at org.apache.hudi.org.apache.hive.service.cli.TypeDescriptor.<init>(TypeDescriptor.java:47) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.org.apache.hive.service.cli.ColumnDescriptor.<init>(ColumnDescriptor.java:46) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.org.apache.hive.service.cli.TableSchema.<init>(TableSchema.java:46) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.org.apache.hive.jdbc.HiveQueryResultSet.retrieveSchema(HiveQueryResultSet.java:264) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.org.apache.hive.jdbc.HiveQueryResultSet.<init>(HiveQueryResultSet.java:198) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.org.apache.hive.jdbc.HiveQueryResultSet$Builder.build(HiveQueryResultSet.java:179) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.org.apache.hive.jdbc.HiveDatabaseMetaData.getColumns(HiveDatabaseMetaData.java:278) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.hive.ddl.JDBCExecutor.getTableSchema(JDBCExecutor.java:129) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.hive.HoodieHiveClient.getTableSchema(HoodieHiveClient.java:225) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.hive.HiveSyncTool.syncSchema(HiveSyncTool.java:248) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:184) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.hive.HiveSyncTool.doSync(HiveSyncTool.java:129) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:115) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.sink.StreamWriteOperatorCoordinator.syncHive(StreamWriteOperatorCoordinator.java:302) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at org.apache.hudi.sink.utils.NonThrownExecutor.lambda$execute$0(NonThrownExecutor.java:93) ~[hudi-flink-bundle_2.11-0.10.1.jar:0.10.1]
       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_202]
       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_202]
       at java.lang.Thread.run(Thread.java:748) [?:1.8.0_202]
   2022-04-21 14:03:12,629 INFO  org.apache.hudi.client.AbstractHoodieClient                 [] - Stopping Timeline service !!
   2022-04-21 14:03:12,629 INFO  org.apache.hudi.client.embedded.EmbeddedTimelineService      [] - Closing Timeline server
   2022-04-21 14:03:12,630 INFO  org.apache.hudi.timeline.service.TimelineService             [] - Closing Timeline Service
   ```

