Thanks Michael!

I run it using spark-shell, so I added both jars through the
bin/spark-shell --jars option.  I noticed that if I don't pass these jars,
it complains it can't find the driver; if I do pass them through --jars,
it complains there is no suitable driver.
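
For reference, the command I'm launching is something like this (the jar
paths below are just where I keep the drivers locally):

  bin/spark-shell --jars /path/to/db2jcc-10.5.jar,/path/to/db2jcc_license_cisuz-10.5.jar

Could it be that --jars only makes the jars visible to the executors,
while the metastore connection is opened from the driver JVM?  If so,
would passing them on the driver classpath as well make a difference?
Just a guess on my part, e.g.:

  bin/spark-shell \
    --driver-class-path /path/to/db2jcc-10.5.jar:/path/to/db2jcc_license_cisuz-10.5.jar \
    --jars /path/to/db2jcc-10.5.jar,/path/to/db2jcc_license_cisuz-10.5.jar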

Regards.


On Tue, Jun 17, 2014 at 2:43 AM, Michael Armbrust <mich...@databricks.com>
wrote:

> First, a clarification: Spark SQL does not talk to HiveServer2, as that
> JDBC interface is for retrieving results from queries that are executed
> using Hive.  Instead, Spark SQL executes queries itself by directly
> accessing your data using Spark.
>
> Spark SQL's Hive module can use JDBC to connect to an external metastore,
> in your case DB2.  This is only used to retrieve the metadata (i.e.,
> column names and types, and HDFS locations for the data).
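>
> For reference, the metastore connection is configured with the standard
> javax.jdo properties in hive-site.xml.  For DB2 they should look roughly
> like this (the driver class name assumes IBM's JCC driver; the URL and
> username are taken from your exception):
>
>   <property>
>     <name>javax.jdo.option.ConnectionURL</name>
>     <value>jdbc:db2://localhost:50001/BIDB</value>
>   </property>
>   <property>
>     <name>javax.jdo.option.ConnectionDriverName</name>
>     <value>com.ibm.db2.jcc.DB2Driver</value>
>   </property>
>   <property>
>     <name>javax.jdo.option.ConnectionUserName</name>
>     <value>catalog</value>
>   </property>
>
> A "No suitable driver" error from DriverManager generally means that no
> registered driver accepted the JDBC URL, which usually comes down to the
> driver jar not being visible to the classloader that opens the connection.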
>
> Looking at your exception, I still see "java.sql.SQLException: No suitable
> driver", so my guess would be that the DB2 JDBC drivers are not being
> correctly included.  How are you trying to add them to the classpath?
>
> Michael
>
>
> On Tue, Jun 17, 2014 at 1:29 AM, Jenny Zhao <linlin200...@gmail.com>
> wrote:
>
>>
>> Hi,
>>
>> My Hive configuration uses DB2 as its metastore database.  I built Spark
>> with the extra step sbt/sbt assembly/assembly to include the dependency
>> jars, and copied HIVE_HOME/conf/hive-site.xml under spark/conf.  When I
>> ran:
>>
>> hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
>>
>> I got the following exception (a portion of the stack trace is pasted
>> below).  Looking at the stack, I wondered whether Spark supports a remote
>> metastore configuration; it seems Spark doesn't talk to HiveServer2
>> directly?  The driver jars db2jcc-10.5.jar and
>> db2jcc_license_cisuz-10.5.jar are both included in the classpath;
>> otherwise it complains it can't find the driver.
>>
>> I'd appreciate any help resolving this.
>>
>> Thanks!
>>
>> Caused by: java.sql.SQLException: Unable to open a test connection to the
>> given database. JDBC url = jdbc:db2://localhost:50001/BIDB, username =
>> catalog. Terminating connection pool. Original Exception: ------
>> java.sql.SQLException: No suitable driver
>>         at java.sql.DriverManager.getConnection(DriverManager.java:422)
>>         at java.sql.DriverManager.getConnection(DriverManager.java:374)
>>         at
>> com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:254)
>>         at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:305)
>>         at
>> com.jolbox.bonecp.BoneCPDataSource.maybeInit(BoneCPDataSource.java:150)
>>         at
>> com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:112)
>>         at
>> org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:479)
>>         at
>> org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:304)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>>         at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:56)
>>         at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:39)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:527)
>>         at
>> org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
>>         at
>> org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
>>         at
>> org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1069)
>>         at
>> org.datanucleus.NucleusContext.initialise(NucleusContext.java:359)
>>         at
>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:768)
>>         at
>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:326)
>>         at
>> org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:195)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
>>         at java.lang.reflect.Method.invoke(Method.java:611)
>>         at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
>>         at
>> java.security.AccessController.doPrivileged(AccessController.java:277)
>>         at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
>>         at
>> javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
>>         at
>> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
>>         at
>> javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:275)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:304)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:234)
>>         at
>> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:209)
>>         at
>> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
>>         at
>> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:64)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:73)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:415)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:402)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:441)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:326)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:286)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4060)
>>         at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)
>>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>>         at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:56)
>>         at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:39)
>>         at java.lang.reflect.Constructor.newInstance(Constructor.java:527)
>>         at
>> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1210)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
>>         at
>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
>>         at
>> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2372)
>>         at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2383)
>>         at
>> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1139)
>>         at
>> org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:542)
>>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:493)
>>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:342)
>>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:977)
>>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:888)
>>         at
>> org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:186)
>>         at
>> org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:160)
>>         at
>> org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:250)
>>         at
>> org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:247)
>>         at
>> org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:85)
>>         at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:90)
>>
>>
>
