ighack created ZEPPELIN-5361:
--------------------------------

             Summary: java.lang.NoSuchMethodError: 
org.apache.thrift.protocol.TProtocol.getScheme()Ljava/lang/Class;
                 Key: ZEPPELIN-5361
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-5361
             Project: Zeppelin
          Issue Type: Bug
          Components: flink
    Affects Versions: 0.9.0
            Reporter: ighack


flink-1.12.3

CDH 6.3.2

hive 2.1.1

In $FLINK_HOME/lib I have:

hive-exec-2.1.1-cdh6.3.2.jar

flink-sql-connector-hive-2.2.0_2.11-1.12.3.jar

flink-hadoop-compatibility_2.11-1.12.3.jar
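
Since the error below involves org.apache.thrift.protocol.TProtocol, it may help to know which of these jars bundle that class. A minimal sketch that scans the lib directory (the class name FindThriftJars is mine, and it assumes FLINK_HOME is set in the environment):

    import java.io.File;
    import java.util.zip.ZipFile;

    public class FindThriftJars {
        public static void main(String[] args) throws Exception {
            // Look inside every jar in $FLINK_HOME/lib for the Thrift TProtocol class
            File lib = new File(System.getenv("FLINK_HOME"), "lib");
            File[] jars = lib.listFiles((dir, name) -> name.endsWith(".jar"));
            if (jars == null) return; // FLINK_HOME unset or lib directory missing
            for (File jar : jars) {
                try (ZipFile zf = new ZipFile(jar)) {
                    if (zf.getEntry("org/apache/thrift/protocol/TProtocol.class") != null) {
                        System.out.println(jar.getName());
                    }
                }
            }
        }
    }

If more than one jar prints, two copies of Thrift share the classpath and classloading order decides which TProtocol wins.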

 

I use Flink with Hive. In Zeppelin, I enable Hive for the Flink interpreter.
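
For reference, this is roughly how Hive gets enabled in the Flink interpreter settings (property names as documented for the Zeppelin 0.9 Flink interpreter; the concrete values are assumptions matching my environment, and HIVE_CONF_DIR is an environment variable rather than a property):

    zeppelin.flink.enableHive      true
    zeppelin.flink.hive.version    2.1.1
    HIVE_CONF_DIR                  /etc/hive/conf   (directory containing hive-site.xml; example path)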

With Hive enabled, I get this error:

org.apache.zeppelin.interpreter.InterpreterException: java.lang.NoSuchMethodError: org.apache.thrift.protocol.TProtocol.getScheme()Ljava/lang/Class;
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:836)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:744)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
    at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:132)
    at org.apache.zeppelin.scheduler.FIFOScheduler.lambda$runJobInScheduler$0(FIFOScheduler.java:42)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoSuchMethodError: org.apache.thrift.protocol.TProtocol.getScheme()Ljava/lang/Class;
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_database_args.write(ThriftHiveMetastore.java:26561)
    at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:63)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_get_database(ThriftHiveMetastore.java:764)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:756)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1519)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:150)
    at com.sun.proxy.$Proxy42.getDatabase(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2562)
    at com.sun.proxy.$Proxy43.getDatabase(Unknown Source)
    at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:119)
    at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:382)
    at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:269)
    at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:186)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:351)
    at org.apache.zeppelin.flink.FlinkScalaInterpreter.registerHiveCatalog(FlinkScalaInterpreter.scala:461)
    at org.apache.zeppelin.flink.FlinkScalaInterpreter.open(FlinkScalaInterpreter.scala:134)
    at org.apache.zeppelin.flink.FlinkInterpreter.open(FlinkInterpreter.java:65)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    ... 8 more
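
A NoSuchMethodError like this usually means the TProtocol class loaded at runtime comes from an older Thrift copy than the one hive-exec's generated ThriftHiveMetastore code was compiled against. A small sketch to confirm which jar actually supplies TProtocol and whether getScheme() exists (the class name ThriftProbe is mine; to be meaningful it has to run on the same classpath as the Flink interpreter):

    public class ThriftProbe {
        public static void main(String[] args) throws Exception {
            // Load TProtocol the way the interpreter's classloader would
            Class<?> proto = Class.forName("org.apache.thrift.protocol.TProtocol");
            // Jar the class was actually loaded from (may be null for bootstrap classes)
            System.out.println(proto.getProtectionDomain().getCodeSource().getLocation());
            try {
                // Present in recent libthrift; missing from older bundled copies
                System.out.println(proto.getMethod("getScheme"));
            } catch (NoSuchMethodException e) {
                System.out.println("getScheme() missing -> same conflict as the stack trace above");
            }
        }
    }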


