This is not a bug in Hive. Spark uses hive-site.xml to find the location of
the Hive metastore.

You cannot simply connect to the metastore database and interrogate it
directly; you would need to know the metastore schema.

  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://<hostname>:9083</value>
    <description>Thrift URI for the remote metastore. Used by metastore
client to connect to remote metastore.</description>
  </property>
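As a quick sanity check, you can read the value Spark will pick up straight out of hive-site.xml. A minimal Python sketch (the config path in the example is an assumption; adjust it to your installation):

```python
import xml.etree.ElementTree as ET

def get_hive_property(conf_path, prop_name):
    """Return the value of a named property from a hive-site.xml file,
    or None if the property is not defined."""
    root = ET.parse(conf_path).getroot()
    for prop in root.findall("property"):
        if prop.findtext("name") == prop_name:
            return prop.findtext("value")
    return None

# Example (hypothetical path -- use your own conf directory):
# uris = get_hive_property("/usr/lib/hive/conf/hive-site.xml",
#                          "hive.metastore.uris")
```

If this returns None or an unexpected value, Spark is not seeing the hive-site.xml you think it is.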

The metastore service on port 9083 (the default port) needs to be up and running. Start it with

$HIVE_HOME/bin/hive --service metastore &

before any connection to the Hive metastore can be established.
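To confirm the service actually came up, it is enough to check whether anything is listening on the metastore port. A rough Python sketch (the hostname and port below are assumptions based on the default configuration above):

```python
import socket

def is_port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds,
    i.e. the metastore (or any other service) is listening there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hypothetical host -- use your metastore host):
# print(is_port_open("localhost", 9083))
```

A True here only proves something is listening on the port; it does not validate that it is the Thrift metastore, but it rules out the most common failure (the service was never started).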

As an example, once the service is running you can see the connections the
metastore makes to the metastore database (here an Oracle instance):

Login     OS Proc/ID     Client Proc/ID   SID   SER#   HOST      PROGRAM            Logged/Hours
--------- -------------- ---------------- ----- ------ --------- ------------------ ------------
HIVEUSER  oracle/11971   hduser/1234      245   30286  rhes564   JDBC Thin Client              1
HIVEUSER  oracle/11973   hduser/1234      284   14712  rhes564   JDBC Thin Client              1
HIVEUSER  oracle/15394   hduser/1234      167   7586   rhes564   JDBC Thin Client             70
HIVEUSER  oracle/15396   hduser/1234      208   15532  rhes564   JDBC Thin Client             70
HIVEUSER  oracle/15418   hduser/1234      246   20955  rhes564   JDBC Thin Client             70
HIVEUSER  oracle/15423   hduser/1234      286   19165  rhes564   JDBC Thin Client             70
HIVEUSER  oracle/15427   hduser/1234      325   44799  rhes564   JDBC Thin Client             70
HIVEUSER  oracle/15429   hduser/1234      369   20697  rhes564   JDBC Thin Client             70
HIVEUSER  oracle/15431   hduser/1234      408   39060  rhes564   JDBC Thin Client             70
HIVEUSER  oracle/15433   hduser/1234      446   10661  rhes564   JDBC Thin Client             70
HIVEUSER  oracle/15689   hduser/1234      7     6105   rhes564   JDBC Thin Client             70
HIVEUSER  oracle/15691   hduser/1234      49    14162  rhes564   JDBC Thin Client             70

HTH

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 19 April 2016 at 06:53, Sea <261810...@qq.com> wrote:

> It's a bug of hive. Please use hive metastore service instead of visiting
> mysql directly.
> set hive.metastore.uris in hive-site.xml
>
>
>
> ------------------ Original Message ------------------
> *From:* "Jieliang Li" <ljl1988...@126.com>;
> *Sent:* Tuesday, 19 April 2016, 12:55 PM
> *To:* "user" <user@spark.apache.org>;
> *Subject:* spark sql on hive
>
> Hi everyone. I use Spark SQL, but it throws an exception:
> Retrying creating default database after error: Error creating
> transactional connection factory
> javax.jdo.JDOFatalInternalException: Error creating transactional
> connection factory
> at
> org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
> at
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
> at
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
> at
> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:56)
> at
> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:579)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:557)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:606)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:448)
> at
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
> at
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5601)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:193)
> at
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1486)
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:64)
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)
> at
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2841)
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2860)
> at
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:453)
> at
> org.apache.spark.sql.hive.HiveContext.sessionState$lzycompute(HiveContext.scala:229)
> at
> org.apache.spark.sql.hive.HiveContext.sessionState(HiveContext.scala:225)
> at
> org.apache.spark.sql.hive.HiveContext$QueryExecution.<init>(HiveContext.scala:373)
> at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:80)
> at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:49)
> at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:131)
> at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
> at org.apache.spark.sql.SQLContext.load(SQLContext.scala:680)
> at org.apache.spark.sql.SQLContext.load(SQLContext.scala:667)
> at
> com.chinamobile.cmss.pdm.algo.ml.sparkbased.statistics.Correlation$$anonfun$2$$anonfun$apply$2$$anonfun$apply$3$$anonfun$apply$4.apply(CorrelationPipe.scala:46)
> at
> com.chinamobile.cmss.pdm.algo.ml.sparkbased.statistics.Correlation$$anonfun$2$$anonfun$apply$2$$anonfun$apply$3$$anonfun$apply$4.apply(CorrelationPipe.scala:38)
> at com.chinamobile.cmss.pdm.common.yavf.Lazy.expr$lzycompute(Lazy.scala:12)
> at com.chinamobile.cmss.pdm.common.yavf.Lazy.expr(Lazy.scala:12)
> at com.chinamobile.cmss.pdm.common.yavf.Lazy.unary_$bang(Lazy.scala:14)
> at
> com.chinamobile.cmss.pdm.algo.AlgorithmRunner$class.execute(AlgorithmRunner.scala:28)
> at
> com.chinamobile.cmss.pdm.algo.ml.sparkbased.statistics.Correlation.execute(CorrelationPipe.scala:32)
> at
> com.chinamobile.cmss.pdm.algo.ml.sparkbased.statistics.Correlation$.main(CorrelationPipe.scala:60)
> at
> com.chinamobile.cmss.pdm.algo.ml.sparkbased.statistics.Correlation.main(CorrelationPipe.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:480)
> NestedThrowablesStackTrace:
> java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
> at
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)
> at
> org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)
> at
> org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)
> at
> org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
> at
> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
> at
> org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
> at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
> at
> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
> at
> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
> at
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
> at
> org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:56)
> at
> org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:579)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:557)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:606)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:448)
> at
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
> at
> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5601)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:193)
> at
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at
> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1486)
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:64)
> at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)
> at
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2841)
> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2860)
> at
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:453)
> at
> org.apache.spark.sql.hive.HiveContext.sessionState$lzycompute(HiveContext.scala:229)
> at
> org.apache.spark.sql.hive.HiveContext.sessionState(HiveContext.scala:225)
> at
> org.apache.spark.sql.hive.HiveContext$QueryExecution.<init>(HiveContext.scala:373)
> at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:80)
> at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:49)
> at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:131)
> at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
> at org.apache.spark.sql.SQLContext.load(SQLContext.scala:680)
> at org.apache.spark.sql.SQLContext.load(SQLContext.scala:667)
> at
> com.chinamobile.cmss.pdm.algo.ml.sparkbased.statistics.Correlation$$anonfun$2$$anonfun$apply$2$$anonfun$apply$3$$anonfun$apply$4.apply(CorrelationPipe.scala:46)
> at
> com.chinamobile.cmss.pdm.algo.ml.sparkbased.statistics.Correlation$$anonfun$2$$anonfun$apply$2$$anonfun$apply$3$$anonfun$apply$4.apply(CorrelationPipe.scala:38)
> at com.chinamobile.cmss.pdm.common.yavf.Lazy.expr$lzycompute(Lazy.scala:12)
> at com.chinamobile.cmss.pdm.common.yavf.Lazy.expr(Lazy.scala:12)
> at com.chinamobile.cmss.pdm.common.yavf.Lazy.unary_$bang(Lazy.scala:14)
> at
> com.chinamobile.cmss.pdm.algo.AlgorithmRunner$class.execute(AlgorithmRunner.scala:28)
> at
> com.chinamobile.cmss.pdm.algo.ml.sparkbased.statistics.Correlation.execute(CorrelationPipe.scala:32)
> at
> com.chinamobile.cmss.pdm.algo.ml.sparkbased.statistics.Correlation$.main(CorrelationPipe.scala:60)
> at
> com.chinamobile.cmss.pdm.algo.ml.sparkbased.statistics.Correlation.main(CorrelationPipe.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:480)
> Caused by: java.lang.ExceptionInInitializerError
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at java.lang.Class.newInstance(Class.java:383)
> at
> org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:47)
> at
> org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)
> at
> org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)
> at
> org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)
> at
> org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)
> ... 80 more
> Caused by: java.lang.SecurityException: sealing violation: can't seal
> package org.apache.derby.impl.services.locks: already loaded
> at java.net.URLClassLoader.getAndVerifyPackage(URLClassLoader.java:395)
> at java.net.URLClassLoader.defineClass(URLClassLoader.java:417)
> at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> at java.lang.ClassLoader.defineClass1(Native Method)
> at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
> at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:195)
> at
> org.apache.derby.impl.services.monitor.BaseMonitor.getImplementations(Unknown
> Source)
> at
> org.apache.derby.impl.services.monitor.BaseMonitor.getDefaultImplementations(Unknown
> Source)
> at org.apache.derby.impl.services.monitor.BaseMonitor.runWithState(Unknown
> Source)
> at org.apache.derby.impl.services.monitor.FileMonitor.<init>(Unknown
> Source)
> at org.apache.derby.iapi.services.monitor.Monitor.startMonitor(Unknown
> Source)
> at org.apache.derby.iapi.jdbc.JDBCBoot.boot(Unknown Source)
> at org.apache.derby.jdbc.EmbeddedDriver.boot(Unknown Source)
> at org.apache.derby.jdbc.EmbeddedDriver.<clinit>(Unknown Source)
> ... 90 more