Hi,

When I try to use HiveContext in the Spark shell on AWS EMR, I get the error
"java.lang.IllegalAccessError: tried to access method
com.google.common.collect.MapMaker.makeComputingMap(Lcom/google/common/base/Function;)Ljava/util/concurrent/ConcurrentMap;".
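
In case it's relevant for diagnosis: the directories that my SPARK_CLASSPATH
(see step 4 below) points at can be checked for extra Guava jars with
something like

find /home/hadoop /usr/share/aws -name 'guava*.jar' 2>/dev/null

but I haven't worked out which version actually gets loaded.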

I followed the steps below to compile and install Spark (I tested 1.0.0,
1.0.1, and 1.0.2).

Step 1:
./make-distribution.sh --hadoop 2.4.0 --with-hive --tgz
Success!
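
For 1.0.2 this produces spark-1.0.2-bin-2.4.0.tgz, which I later unpack on
the EMR master, roughly:

tar xzf spark-1.0.2-bin-2.4.0.tgz -C /home/hadoop/.versions/
ln -s /home/hadoop/.versions/spark-1.0.2-bin-2.4.0 /home/hadoop/spark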

Step 2:
elastic-mapreduce --create --alive --name "Spark Test" --ami-version 3.1.0 --instance-type m3.xlarge --instance-count 2
Hadoop version: 2.4.0
Hive: 0.11.0

Success!

Step 3: Download Scala
wget --no-check-certificate https://s3.amazonaws.com/spark-related-packages/scala-2.10.3.tgz
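
I unpack it to match the SCALA_HOME set below:

tar xzf scala-2.10.3.tgz -C /home/hadoop/.versions/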


Step 4: Install and configure Hive, Spark, and Scala

# edit hive-site.xml
Add the account and password for the Amazon RDS instance so Hive can reach
its remote metastore.
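
Roughly, the connection properties in hive-site.xml look like this (the RDS
endpoint, user name, and password are redacted here):

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://[rds-endpoint]:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>[redacted]</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>[redacted]</value>
</property>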

Successfully connected to RDS!

# edit bashrc
vim /home/hadoop/.bashrc
export SCALA_HOME=/home/hadoop/.versions/scala-2.10.3


# create spark-env.sh
vim /home/hadoop/spark/conf/spark-env.sh
export SPARK_MASTER_IP=10.218.180.250
export SCALA_HOME=/home/hadoop/.versions/scala-2.10.3
export SPARK_LOCAL_DIRS=/mnt/spark/
export SPARK_CLASSPATH="/usr/share/aws/emr/emr-fs/lib/*:/usr/share/aws/emr/lib/*:/home/hadoop/share/hadoop/common/lib/*:/home/hadoop/.versions/2.4.0/share/hadoop/common/lib/hadoop-lzo.jar"
export SPARK_DAEMON_JAVA_OPTS="-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps"

# copy core-site.xml to spark and shark
cp /home/hadoop/conf/core-site.xml /home/hadoop/spark/conf/



Step 5: Start Spark
/home/hadoop/spark/sbin/start-master.sh

At this point, Spark can read and write data in Amazon S3.
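
For example, this quick check works once the shell in step 6 is up (the
bucket and path here are placeholders):

scala> sc.textFile("s3n://my-bucket/some-file.txt").count()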

Step 6: Start the Spark shell
./spark/bin/spark-shell --master spark://10.218.180.250:7077 --driver-class-path spark/lib/mysql-connector-java-5.1.26-bin.jar

Step 7: Error log

scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
14/08/07 09:38:39 INFO Configuration.deprecation:
mapred.input.dir.recursive is deprecated. Instead, use
mapreduce.input.fileinputformat.input.dir.recursive
14/08/07 09:38:39 INFO Configuration.deprecation: mapred.max.split.size is
deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
14/08/07 09:38:39 INFO Configuration.deprecation: mapred.min.split.size is
deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
14/08/07 09:38:39 INFO Configuration.deprecation:
mapred.min.split.size.per.rack is deprecated. Instead, use
mapreduce.input.fileinputformat.split.minsize.per.rack
14/08/07 09:38:39 INFO Configuration.deprecation:
mapred.min.split.size.per.node is deprecated. Instead, use
mapreduce.input.fileinputformat.split.minsize.per.node
14/08/07 09:38:39 INFO Configuration.deprecation: mapred.reduce.tasks is
deprecated. Instead, use mapreduce.job.reduces
14/08/07 09:38:39 INFO Configuration.deprecation:
mapred.reduce.tasks.speculative.execution is deprecated. Instead, use
mapreduce.reduce.speculative
hiveContext: org.apache.spark.sql.hive.HiveContext =
org.apache.spark.sql.hive.HiveContext@45be296f

scala> import hiveContext._
import hiveContext._

scala> hql("show tables")
14/08/07 09:38:48 INFO parse.ParseDriver: Parsing command: show tables
14/08/07 09:38:48 INFO parse.ParseDriver: Parse Completed
14/08/07 09:38:48 INFO analysis.Analyzer: Max iterations (2) reached for
batch MultiInstanceRelations
14/08/07 09:38:48 INFO analysis.Analyzer: Max iterations (2) reached for
batch CaseInsensitiveAttributeReferences
14/08/07 09:38:48 INFO analysis.Analyzer: Max iterations (2) reached for
batch Check Analysis
14/08/07 09:38:48 INFO sql.SQLContext$$anon$1: Max iterations (2) reached
for batch Add exchange
14/08/07 09:38:48 INFO sql.SQLContext$$anon$1: Max iterations (2) reached
for batch Prepare Expressions
14/08/07 09:38:49 INFO Configuration.deprecation:
mapred.input.dir.recursive is deprecated. Instead, use
mapreduce.input.fileinputformat.input.dir.recursive
14/08/07 09:38:49 INFO ql.Driver: <PERFLOG method=Driver.run>
14/08/07 09:38:49 INFO ql.Driver: <PERFLOG method=TimeToSubmit>
14/08/07 09:38:49 INFO ql.Driver: <PERFLOG method=compile>
14/08/07 09:38:49 INFO ql.Driver: <PERFLOG method=parse>
14/08/07 09:38:49 INFO parse.ParseDriver: Parsing command: show tables
14/08/07 09:38:49 INFO parse.ParseDriver: Parse Completed
14/08/07 09:38:49 INFO ql.Driver: </PERFLOG method=parse
start=1407404329052 end=1407404329052 duration=0>
14/08/07 09:38:49 INFO ql.Driver: <PERFLOG method=semanticAnalyze>
14/08/07 09:38:49 INFO ql.Driver: Semantic Analysis Completed
14/08/07 09:38:49 INFO ql.Driver: </PERFLOG method=semanticAnalyze
start=1407404329052 end=1407404329189 duration=137>
14/08/07 09:38:49 INFO exec.ListSinkOperator: Initializing Self 0 OP
14/08/07 09:38:49 INFO exec.ListSinkOperator: Operator 0 OP initialized
14/08/07 09:38:49 INFO exec.ListSinkOperator: Initialization Done 0 OP
14/08/07 09:38:49 INFO ql.Driver: Returning Hive schema:
Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from
deserializer)], properties:null)
14/08/07 09:38:49 INFO ql.Driver: </PERFLOG method=compile
start=1407404329028 end=1407404329292 duration=264>
14/08/07 09:38:49 INFO ql.Driver: <PERFLOG method=Driver.execute>
14/08/07 09:38:49 INFO Configuration.deprecation: mapred.job.name is
deprecated. Instead, use mapreduce.job.name
14/08/07 09:38:49 INFO ql.Driver: Starting command: show tables
14/08/07 09:38:49 INFO ql.Driver: </PERFLOG method=TimeToSubmit
start=1407404329027 end=1407404329310 duration=283>
14/08/07 09:38:49 INFO ql.Driver: <PERFLOG method=runTasks>
14/08/07 09:38:49 INFO ql.Driver: <PERFLOG method=task.DDL.Stage-0>
14/08/07 09:38:49 INFO metastore.HiveMetaStore: 0: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
14/08/07 09:38:49 INFO metastore.ObjectStore: ObjectStore, initialize called
14/08/07 09:38:49 WARN DataNucleus.General: Plugin (Bundle)
"org.datanucleus" is already registered. Ensure you dont have multiple JAR
versions of the same plugin in the classpath. The URL
"file:/home/hadoop/.versions/spark-1.0.2-bin-2.4.0/lib/datanucleus-core-3.2.2.jar"
is already registered, and you are trying to register an identical plugin
located at URL "file:/home/hadoop/spark/lib/datanucleus-core-3.2.2.jar."
14/08/07 09:38:49 WARN DataNucleus.General: Plugin (Bundle)
"org.datanucleus.store.rdbms" is already registered. Ensure you dont have
multiple JAR versions of the same plugin in the classpath. The URL
"file:/home/hadoop/.versions/spark-1.0.2-bin-2.4.0/lib/datanucleus-rdbms-3.2.1.jar"
is already registered, and you are trying to register an identical plugin
located at URL "file:/home/hadoop/spark/lib/datanucleus-rdbms-3.2.1.jar."
14/08/07 09:38:49 WARN DataNucleus.General: Plugin (Bundle)
"org.datanucleus.api.jdo" is already registered. Ensure you dont have
multiple JAR versions of the same plugin in the classpath. The URL
"file:/home/hadoop/spark/lib/datanucleus-api-jdo-3.2.1.jar" is already
registered, and you are trying to register an identical plugin located at
URL
"file:/home/hadoop/.versions/spark-1.0.2-bin-2.4.0/lib/datanucleus-api-jdo-3.2.1.jar."
14/08/07 09:38:49 INFO DataNucleus.Persistence: Property
datanucleus.cache.level2 unknown - will be ignored
14/08/07 09:38:50 ERROR exec.DDLTask:
org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.RuntimeException: Unable to instantiate
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at
org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1143)
        at
org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1128)
        at
org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2236)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:333)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
        at
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1414)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1192)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1020)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
        at
org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:189)
        at
org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:163)
        at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
        at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
        at
org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
        at
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:250)
        at
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:250)
        at
org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
        at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:104)
        at
org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:75)
        at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:78)
        at $line10.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:18)
        at $line10.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:23)
        at $line10.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
        at $line10.$read$$iwC$$iwC$$iwC.<init>(<console>:27)
        at $line10.$read$$iwC$$iwC.<init>(<console>:29)
        at $line10.$read$$iwC.<init>(<console>:31)
        at $line10.$read.<init>(<console>:33)
        at $line10.$read$.<init>(<console>:37)
        at $line10.$read$.<clinit>(<console>)
        at $line10.$eval$.<init>(<console>:7)
        at $line10.$eval$.<clinit>(<console>)
        at $line10.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
        at
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
        at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
        at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
        at
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
        at
org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:601)
        at
org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:608)
        at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:611)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:936)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at
org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1212)
        at
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
        at
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
        at
org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2372)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2383)
        at
org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1139)
        ... 62 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1210)
        ... 67 more
Caused by: javax.jdo.JDOFatalInternalException: Error creating
transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
        at
org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
        at
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:781)
        at
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:326)
        at
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:195)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
        at
javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
        at
javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
        at
javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
        at
org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:275)
        at
org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:304)
        at
org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:234)
        at
org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:209)
        at
org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
        at
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
        at
org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:64)
        at
org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:73)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:415)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:402)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:441)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:326)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:286)
        at
org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
        at
org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
        at
org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4060)
        at
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)
        ... 72 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
        at
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)
        at
org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:281)
        at
org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:239)
        at
org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:292)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at
org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
        at
org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
        at
org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1069)
        at
org.datanucleus.NucleusContext.initialise(NucleusContext.java:359)
        at
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:768)
        ... 101 more
Caused by: java.lang.IllegalAccessError: tried to access method
com.google.common.collect.MapMaker.makeComputingMap(Lcom/google/common/base/Function;)Ljava/util/concurrent/ConcurrentMap;
from class com.jolbox.bonecp.BoneCPDataSource
        at
com.jolbox.bonecp.BoneCPDataSource.<init>(BoneCPDataSource.java:64)
        at
org.datanucleus.store.rdbms.datasource.BoneCPDataSourceFactory.makePooledDataSource(BoneCPDataSourceFactory.java:73)
        at
org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:217)
        at
org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:110)
        at
org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:82)
        ... 119 more

14/08/07 09:38:50 INFO ql.Driver: </PERFLOG method=task.DDL.Stage-0
start=1407404329311 end=1407404330131 duration=820>
14/08/07 09:38:50 ERROR ql.Driver: FAILED: Execution Error, return code 1
from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException:
Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
14/08/07 09:38:50 INFO ql.Driver: </PERFLOG method=Driver.execute
start=1407404329292 end=1407404330135 duration=843>
14/08/07 09:38:50 INFO ql.Driver: <PERFLOG method=releaseLocks>
14/08/07 09:38:50 INFO ql.Driver: </PERFLOG method=releaseLocks
start=1407404330135 end=1407404330135 duration=0>
14/08/07 09:38:50 INFO ql.Driver: <PERFLOG method=releaseLocks>
14/08/07 09:38:50 INFO ql.Driver: </PERFLOG method=releaseLocks
start=1407404330136 end=1407404330136 duration=0>
14/08/07 09:38:50 ERROR hive.HiveContext:
======================
HIVE FAILURE OUTPUT
======================
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: Unable
to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

======================
END HIVE FAILURE OUTPUT
======================

org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution
Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
java.lang.RuntimeException: Unable to instantiate
org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at
org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:193)
        at
org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:163)
        at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
        at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
        at
org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
        at
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:250)
        at
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:250)
        at
org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
        at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:104)
        at
org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:75)
        at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:78)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:18)
        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:23)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
        at $iwC$$iwC$$iwC.<init>(<console>:27)
        at $iwC$$iwC.<init>(<console>:29)
        at $iwC.<init>(<console>:31)
        at <init>(<console>:33)
        at .<init>(<console>:37)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
        at
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
        at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
        at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
        at
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
        at
org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:601)
        at
org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:608)
        at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:611)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:936)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at
org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
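
From the last "Caused by", my guess is a Guava version conflict: BoneCP
calls the deprecated com.google.common.collect.MapMaker.makeComputingMap,
which is no longer publicly accessible in recent Guava releases, and my
SPARK_CLASSPATH mixes the EMR/Hadoop jar directories with the Guava that
Spark 1.0.x is built against (14.0.1). If that is the cause, is forcing a
single Guava version onto the driver classpath the right fix? Something
like (the guava jar path is just an example):

./spark/bin/spark-shell --master spark://10.218.180.250:7077 --driver-class-path spark/lib/mysql-connector-java-5.1.26-bin.jar:/path/to/guava-14.0.1.jar

Any pointers would be appreciated.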


--
Shen Zhun
Data Mining at LightnInTheBox.com
Email: shenzhunal...@gmail.com | shenz...@yahoo.com
Phone: 186 0627 7769
GitHub: https://github.com/shenzhun
LinkedIn: http://www.linkedin.com/in/shenzhun
