narendra maru created ZEPPELIN-2138:
---------------------------------------

             Summary: HiveContext creation fails: JDOPersistenceManagerFactory not found (HiveMetaStore.java[createDefaultDB]:622)
                 Key: ZEPPELIN-2138
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-2138
             Project: Zeppelin
          Issue Type: Bug
          Components: zeppelin-interpreter, zeppelin-server
    Affects Versions: 0.7.0
         Environment: Ubuntu 12.04, Hadoop 2.6.0, Spark 1.6.0, Hive 1.2.1, Zeppelin 0.7.0
            Reporter: narendra maru


Creating a HiveContext from the Zeppelin server fails. The same setup
successfully creates Hive tables and a HiveContext from Spark 1.6.0 directly,
using the Thrift server.
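
For reference, a Zeppelin paragraph of roughly this shape triggers the
failure (a sketch, not the exact note code, which is not included in this
report; `sc` is the SparkContext injected by the %spark interpreter):

    %spark
    import org.apache.spark.sql.hive.HiveContext

    // Fails while initializing the execution Hive client, because the
    // DataNucleus JDO classes are not on the interpreter's classpath.
    val hiveContext = new HiveContext(sc)
    hiveContext.sql("SHOW TABLES").show()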


The following error appears in the interpreter log:

 INFO [2017-02-20 12:06:12,004] ({pool-2-thread-5} SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1487592372000 started by scheduler org.apache.zeppelin.spark.SparkInterpreter1426006139
 INFO [2017-02-20 12:06:18,359] ({pool-2-thread-5} SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1487592372000 finished by scheduler org.apache.zeppelin.spark.SparkInterpreter1426006139
 INFO [2017-02-20 12:06:19,939] ({pool-2-thread-4} SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1487592379922 started by scheduler org.apache.zeppelin.spark.SparkInterpreter1426006139
 INFO [2017-02-20 12:06:22,448] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Initializing execution hive, version 1.2.1
 INFO [2017-02-20 12:06:22,455] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Inspected Hadoop version: 2.6.0
 INFO [2017-02-20 12:06:22,456] ({pool-2-thread-4} Logging.scala[logInfo]:58) - Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0
 INFO [2017-02-20 12:06:23,656] ({pool-2-thread-4} SessionState.java[printInfo]:951) - ivysettings.xml file not found in HIVE_HOME or HIVE_CONF_DIR,/home/notroot/lab/software/apache-hive-1.2.1-bin/conf/ivysettings.xml will be used
 INFO [2017-02-20 12:06:23,663] ({pool-2-thread-4} HiveMetaStore.java[newRawStore]:589) - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
 INFO [2017-02-20 12:06:23,707] ({pool-2-thread-4} ObjectStore.java[initialize]:289) - ObjectStore, initialize called
 WARN [2017-02-20 12:06:23,723] ({pool-2-thread-4} HiveMetaStore.java[createDefaultDB]:622) - Retrying creating default database after error: Class org.datanucleus.api.jdo.JDOPersistenceManagerFactory was not found.
javax.jdo.JDOFatalUserException: Class org.datanucleus.api.jdo.JDOPersistenceManagerFactory was not found.
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1175)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
        at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
        at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
        at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
        at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:462)
        at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:461)
        at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40)
        at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:43)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:47)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:49)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:51)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:53)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:55)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:57)
        at $line142600613938.$read$$iwC$$iwC$$iwC.<init>(<console>:59)
        at $line142600613938.$read$$iwC$$iwC.<init>(<console>:61)
        at $line142600613938.$read$$iwC.<init>(<console>:63)
        at $line142600613938.$read.<init>(<console>:65)
        at $line142600613938.$read$.<init>(<console>:69)
        at $line142600613938.$read$.<clinit>(<console>)
        at $line142600613938.$eval$.<init>(<console>:7)
        at $line142600613938.$eval$.<clinit>(<console>)
        at $line142600613938.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
        at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:961)
        at org.apache.zeppelin.spark.SparkInterpreter.interpretInput(SparkInterpreter.java:1187)
        at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:1133)
        at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:1126)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:489)
        at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
        at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
NestedThrowablesStackTrace:
java.lang.ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory
        at scala.tools.nsc.interpreter.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:83)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:278)
        at javax.jdo.JDOHelper$18.run(JDOHelper.java:2018)
        at javax.jdo.JDOHelper$18.run(JDOHelper.java:2016)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.forName(JDOHelper.java:2015)
        at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1162)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
        at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
        at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
        at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
        at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
        at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
        at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
        at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
        at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
        at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
        at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
        at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:462)
        at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:461)
        at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40)
        at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:43)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:45)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:47)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:49)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:51)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:53)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:55)
        at $line142600613938.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:57)
        at $line142600613938.$read$$iwC$$iwC$$iwC.<init>(<console>:59)
        at $line142600613938.$read$$iwC$$iwC.<init>(<console>:61)
        at $line142600613938.$read$$iwC.<init>(<console>:63)
        at $line142600613938.$read.<init>(<console>:65)
        at $line142600613938.$read$.<init>(<console>:69)
        at $line142600613938.$read$.<clinit>(<console>)
        at $line142600613938.$eval$.<init>(<console>:7)
        at $line142600613938.$eval$.<clinit>(<console>)
        at $line142600613938.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
        at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:961)
        at org.apache.zeppelin.spark.SparkInterpreter.interpretInput(SparkInterpreter.java:1187)
        at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:1133)
        at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:1126)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:489)
        at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
        at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
 INFO [2017-02-20 12:06:23,735] ({pool-2-thread-4} HiveMetaStore.java[newRawStore]:589) - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
 INFO [2017-02-20 12:06:23,809] ({pool-2-thread-4} ObjectStore.java[initialize]:289) - ObjectStore, initialize called
 INFO [2017-02-20 12:06:27,548] ({pool-2-thread-4} SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1487592379922 finished by scheduler org.apache.zeppelin.spark.SparkInterpreter1426006139
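
The root cause is the nested java.lang.ClassNotFoundException: the DataNucleus
jars that back Hive's metastore are not on the Spark interpreter's classpath,
while spark-shell and the Thrift server pick them up from $SPARK_HOME/lib
automatically. One possible workaround (a sketch, not verified on this setup;
the version numbers are assumed to match the datanucleus-*.jar files shipped
in Spark 1.6.0's lib/ directory) is to load them through Zeppelin's %dep
interpreter before the first %spark paragraph runs:

    %dep
    // Must run before the spark interpreter starts; restart the
    // interpreter first if a %spark paragraph has already executed.
    z.reset()
    // Assumed versions; check $SPARK_HOME/lib for the exact jars.
    z.load("org.datanucleus:datanucleus-api-jdo:3.2.6")
    z.load("org.datanucleus:datanucleus-core:3.2.10")
    z.load("org.datanucleus:datanucleus-rdbms:3.2.9")

Alternatively, pointing SPARK_HOME in conf/zeppelin-env.sh at the Spark 1.6.0
installation should make Zeppelin launch the interpreter through spark-submit,
which adds these jars itself.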



