As the error messages suggest, double-check your classpath. The root cause at the bottom of your stack trace is that the MySQL JDBC driver ("com.mysql.jdbc.Driver") your metastore is configured to use cannot be found on the classpath. The DataNucleus warnings also show two Spark installations (/home/spark/spark and /home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1) both contributing JARs, which is worth cleaning up as well.
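
If your hive-site.xml really does point the metastore at MySQL, the usual fix is to put the MySQL connector JAR on the driver's classpath when launching the shell. A minimal sketch, assuming the connector lives at the hypothetical path /path/to/mysql-connector-java-5.1.32.jar (substitute wherever yours actually is):

    # launch spark-shell with the MySQL JDBC driver on the driver classpath
    bin/spark-shell --driver-class-path /path/to/mysql-connector-java-5.1.32.jar

On Spark 1.0.x you can alternatively export SPARK_CLASSPATH with the same path before starting spark-shell. Once the shell is up, a quick sanity check that the driver is visible:

    scala> Class.forName("com.mysql.jdbc.Driver")  // should return the class rather than throw ClassNotFoundException

If that succeeds, the metastore should get past the DatastoreDriverNotFoundException in your log.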


From: CharlieLin <chury...@gmail.com>
Date: Tuesday, August 26, 2014 at 8:29 PM
To: "user@spark.apache.org<mailto:user@spark.apache.org>" 
<user@spark.apache.org<mailto:user@spark.apache.org>>
Subject: Execute HiveFromSpark ERROR.

hi, all:
    I tried to use Spark SQL in spark-shell, following the HiveFromSpark example. When I execute:

val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
import hiveContext._
hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")

it reports an error like the one below:

scala> hiveContext.hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
14/08/27 11:08:19 INFO ParseDriver: Parsing command: CREATE TABLE IF NOT EXISTS 
src (key INT, value STRING)
14/08/27 11:08:19 INFO ParseDriver: Parse Completed
14/08/27 11:08:19 INFO Analyzer: Max iterations (2) reached for batch 
MultiInstanceRelations
14/08/27 11:08:19 INFO Analyzer: Max iterations (2) reached for batch 
CaseInsensitiveAttributeReferences
14/08/27 11:08:19 INFO Analyzer: Max iterations (2) reached for batch Check 
Analysis
14/08/27 11:08:19 INFO SQLContext$$anon$1: Max iterations (2) reached for batch 
Add exchange
14/08/27 11:08:19 INFO SQLContext$$anon$1: Max iterations (2) reached for batch 
Prepare Expressions
14/08/27 11:08:19 INFO Driver: <PERFLOG method=Driver.run>
14/08/27 11:08:19 INFO Driver: <PERFLOG method=TimeToSubmit>
14/08/27 11:08:19 INFO Driver: <PERFLOG method=compile>
14/08/27 11:08:19 INFO Driver: <PERFLOG method=parse>
14/08/27 11:08:19 INFO ParseDriver: Parsing command: CREATE TABLE IF NOT EXISTS 
src (key INT, value STRING)
14/08/27 11:08:19 INFO ParseDriver: Parse Completed
14/08/27 11:08:19 INFO Driver: </PERFLOG method=parse start=1409108899822 
end=1409108899822 duration=0>
14/08/27 11:08:19 INFO Driver: <PERFLOG method=semanticAnalyze>
14/08/27 11:08:19 INFO SemanticAnalyzer: Starting Semantic Analysis
14/08/27 11:08:19 INFO SemanticAnalyzer: Creating table src position=27
14/08/27 11:08:19 INFO HiveMetaStore: 0: Opening raw store with implemenation 
class:org.apache.hadoop.hive.metastore.ObjectStore
14/08/27 11:08:19 INFO ObjectStore: ObjectStore, initialize called
14/08/27 11:08:20 WARN General: Plugin (Bundle) "org.datanucleus" is already 
registered. Ensure you dont have multiple JAR versions of the same plugin in 
the classpath. The URL 
"file:/home/spark/spark/lib_managed/jars/datanucleus-core-3.2.2.jar"<file:/home/spark/spark/lib_managed/jars/datanucleus-core-3.2.2.jar>
 is already registered, and you are trying to register an identical plugin 
located at URL 
"file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-core-3.2.2.jar."<file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-core-3.2.2.jar.>
14/08/27 11:08:20 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" 
is already registered. Ensure you dont have multiple JAR versions of the same 
plugin in the classpath. The URL 
"file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-rdbms-3.2.1.jar"<file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-rdbms-3.2.1.jar>
 is already registered, and you are trying to register an identical plugin 
located at URL 
"file:/home/spark/spark/lib_managed/jars/datanucleus-rdbms-3.2.1.jar."<file:/home/spark/spark/lib_managed/jars/datanucleus-rdbms-3.2.1.jar.>
14/08/27 11:08:20 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is 
already registered. Ensure you dont have multiple JAR versions of the same 
plugin in the classpath. The URL 
"file:/home/spark/spark/lib_managed/jars/datanucleus-api-jdo-3.2.1.jar"<file:/home/spark/spark/lib_managed/jars/datanucleus-api-jdo-3.2.1.jar>
 is already registered, and you are trying to register an identical plugin 
located at URL 
"file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-api-jdo-3.2.1.jar."<file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-api-jdo-3.2.1.jar.>
14/08/27 11:08:20 INFO Persistence: Property datanucleus.cache.level2 unknown - 
will be ignored
org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table src
    at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:958)
    at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:905)
    at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:8999)
    at 
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:8313)
    at 
org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:284)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:441)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:342)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:977)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
    at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:189)
    at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:163)
    at 
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
    at 
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
    at 
org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
    at 
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:250)
    at 
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:250)
    at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
    at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:104)
    at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:75)
    at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:78)
    at $line13.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:16)
    at $line13.$read$$iwC$$iwC$$iwC.<init>(<console>:21)
    at $line13.$read$$iwC$$iwC.<init>(<console>:23)
    at $line13.$read$$iwC.<init>(<console>:25)
    at $line13.$read.<init>(<console>:27)
    at $line13.$read$.<init>(<console>:31)
    at $line13.$read$.<clinit>(<console>)
    at $line13.$eval$.<init>(<console>:7)
    at $line13.$eval$.<clinit>(<console>)
    at $line13.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
    at 
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
    at 
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
    at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:601)
    at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:608)
    at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:611)
    at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:936)
    at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at 
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    at 
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1212)
    at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
    at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
    at 
org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2372)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2383)
    at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:950)
    ... 59 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at 
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1210)
    ... 64 more
Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional 
connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
    at 
org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
    at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:781)
    at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:326)
    at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:195)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at 
javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:275)
    at 
org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:304)
    at 
org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:234)
    at 
org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:209)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at 
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at 
org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:64)
    at 
org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:73)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:415)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:402)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:441)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:326)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:286)
    at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
    at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4060)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)
    ... 69 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at 
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
    at 
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)
    at 
org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:281)
    at 
org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:239)
    at 
org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:292)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at 
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
    at 
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    at 
org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1069)
    at org.datanucleus.NucleusContext.initialise(NucleusContext.java:359)
    at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:768)
    ... 98 more
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the 
"BONECP" plugin to create a ConnectionPool gave an error : The specified 
datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. 
Please check your CLASSPATH specification, and the name of the driver.
    at 
org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:237)
    at 
org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:110)
    at 
org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:82)
    ... 116 more
Caused by: 
org.datanucleus.store.rdbms.datasource.DatastoreDriverNotFoundException: The 
specified datastore driver ("com.mysql.jdbc.Driver") was not found in the 
CLASSPATH. Please check your CLASSPATH specification, and the name of the 
driver.
    at 
org.datanucleus.store.rdbms.datasource.AbstractDataSourceFactory.loadDriver(AbstractDataSourceFactory.java:58)
    at 
org.datanucleus.store.rdbms.datasource.BoneCPDataSourceFactory.makePooledDataSource(BoneCPDataSourceFactory.java:61)
    at 
org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:217)
    ... 118 more
14/08/27 11:08:20 INFO Driver: Semantic Analysis Completed
14/08/27 11:08:20 INFO Driver: </PERFLOG method=semanticAnalyze 
start=1409108899823 end=1409108900368 duration=545>
14/08/27 11:08:20 INFO Driver: Returning Hive schema: Schema(fieldSchemas:null, 
properties:null)
14/08/27 11:08:20 INFO Driver: <PERFLOG method=doAuthorization>
14/08/27 11:08:20 INFO HiveMetaStore: 0: Opening raw store with implemenation 
class:org.apache.hadoop.hive.metastore.ObjectStore
14/08/27 11:08:20 INFO ObjectStore: ObjectStore, initialize called
14/08/27 11:08:20 WARN General: Plugin (Bundle) "org.datanucleus" is already 
registered. Ensure you dont have multiple JAR versions of the same plugin in 
the classpath. The URL 
"file:/home/spark/spark/lib_managed/jars/datanucleus-core-3.2.2.jar"<file:/home/spark/spark/lib_managed/jars/datanucleus-core-3.2.2.jar>
 is already registered, and you are trying to register an identical plugin 
located at URL 
"file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-core-3.2.2.jar."<file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-core-3.2.2.jar.>
14/08/27 11:08:20 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" 
is already registered. Ensure you dont have multiple JAR versions of the same 
plugin in the classpath. The URL 
"file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-rdbms-3.2.1.jar"<file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-rdbms-3.2.1.jar>
 is already registered, and you are trying to register an identical plugin 
located at URL 
"file:/home/spark/spark/lib_managed/jars/datanucleus-rdbms-3.2.1.jar."<file:/home/spark/spark/lib_managed/jars/datanucleus-rdbms-3.2.1.jar.>
14/08/27 11:08:20 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is 
already registered. Ensure you dont have multiple JAR versions of the same 
plugin in the classpath. The URL 
"file:/home/spark/spark/lib_managed/jars/datanucleus-api-jdo-3.2.1.jar"<file:/home/spark/spark/lib_managed/jars/datanucleus-api-jdo-3.2.1.jar>
 is already registered, and you are trying to register an identical plugin 
located at URL 
"file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-api-jdo-3.2.1.jar."<file:/home/spark/spark-1.0.2-2.0.0-mr1-cdh-4.2.1/lib_managed/jars/datanucleus-api-jdo-3.2.1.jar.>
14/08/27 11:08:20 INFO Persistence: Property datanucleus.cache.level2 unknown - 
will be ignored
14/08/27 11:08:20 INFO Driver: </PERFLOG method=doAuthorization 
start=1409108900373 end=1409108900480 duration=107>
14/08/27 11:08:20 ERROR Driver: FAILED: HiveException 
java.lang.RuntimeException: Unable to instantiate 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: 
Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1143)
    at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:542)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:493)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:342)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:977)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
    at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:189)
    at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:163)
    at 
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
    at 
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
    at 
org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
    at 
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:250)
    at 
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:250)
    at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
    at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:104)
    at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:75)
    at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:78)
    at $line13.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:16)
    at $line13.$read$$iwC$$iwC$$iwC.<init>(<console>:21)
    at $line13.$read$$iwC$$iwC.<init>(<console>:23)
    at $line13.$read$$iwC.<init>(<console>:25)
    at $line13.$read.<init>(<console>:27)
    at $line13.$read$.<init>(<console>:31)
    at $line13.$read$.<clinit>(<console>)
    at $line13.$eval$.<init>(<console>:7)
    at $line13.$eval$.<clinit>(<console>)
    at $line13.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
    at 
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
    at 
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
    at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:601)
    at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:608)
    at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:611)
    at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:936)
    at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at 
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    at 
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1212)
    at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
    at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
    at 
org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2372)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2383)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1139)
    ... 56 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at 
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1210)
    ... 61 more
Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional 
connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
    at 
org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
    at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:781)
    at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:326)
    at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:195)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at 
javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:275)
    at 
org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:304)
    at 
org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:234)
    at 
org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:209)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:70)
    at 
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
    at 
org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:64)
    at 
org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:73)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:415)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:402)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:441)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:326)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:286)
    at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
    at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4060)
    at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)
    ... 66 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at 
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
    at 
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)
    at 
org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:281)
    at 
org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:239)
    at 
org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:292)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at 
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
    at 
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    at 
org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1069)
    at org.datanucleus.NucleusContext.initialise(NucleusContext.java:359)
    at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:768)
    ... 95 more
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the 
"BONECP" plugin to create a ConnectionPool gave an error : The specified 
datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. 
Please check your CLASSPATH specification, and the name of the driver.
    at 
org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:237)
    at 
org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:110)
    at 
org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:82)
    ... 113 more
Caused by: 
org.datanucleus.store.rdbms.datasource.DatastoreDriverNotFoundException: The 
specified datastore driver ("com.mysql.jdbc.Driver") was not found in the 
CLASSPATH. Please check your CLASSPATH specification, and the name of the 
driver.
    at 
org.datanucleus.store.rdbms.datasource.AbstractDataSourceFactory.loadDriver(AbstractDataSourceFactory.java:58)
    at 
org.datanucleus.store.rdbms.datasource.BoneCPDataSourceFactory.makePooledDataSource(BoneCPDataSourceFactory.java:61)
    at 
org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:217)
    ... 115 more

14/08/27 11:08:20 INFO Driver: </PERFLOG method=compile start=1409108899798 
end=1409108900487 duration=689>
14/08/27 11:08:20 INFO Driver: <PERFLOG method=releaseLocks>
14/08/27 11:08:20 INFO Driver: </PERFLOG method=releaseLocks 
start=1409108900487 end=1409108900487 duration=0>
14/08/27 11:08:20 INFO Driver: <PERFLOG method=releaseLocks>
14/08/27 11:08:20 INFO Driver: </PERFLOG method=releaseLocks 
start=1409108900490 end=1409108900490 duration=0>
14/08/27 11:08:20 ERROR LocalHiveContext:
======================
HIVE FAILURE OUTPUT
======================
SET 
javax.jdo.option.ConnectionURL=jdbc:derby:;databaseName=/home/linqili/metastore;create=true
SET hive.metastore.warehouse.dir=/home/linqili/warehouse
FAILED: HiveException java.lang.RuntimeException: Unable to instantiate 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient

======================
END HIVE FAILURE OUTPUT
======================

org.apache.spark.sql.execution.QueryExecutionException: FAILED: HiveException 
java.lang.RuntimeException: Unable to instantiate 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:193)
    at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:163)
    at 
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
    at 
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
    at 
org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
    at 
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:250)
    at 
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:250)
    at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
    at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:104)
    at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:75)
    at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:78)
    at $iwC$$iwC$$iwC$$iwC.<init>(<console>:16)
    at $iwC$$iwC$$iwC.<init>(<console>:21)
    at $iwC$$iwC.<init>(<console>:23)
    at $iwC.<init>(<console>:25)
    at <init>(<console>:27)
    at .<init>(<console>:31)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
    at 
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
    at 
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
    at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:601)
    at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:608)
    at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:611)
    at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:936)
    at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at 
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
    at 
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
