Was there ever an answer to this? I hit it periodically when a job dies with an error and I then run another job. I have worked around it by going into /var/lib/hive/metastore/metastore_db and removing the *.lck files. I suspect this is exactly the wrong thing to do, since those lock files presumably exist to prevent corruption, but I haven't found another way around the situation.
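For the record, a minimal sketch of that workaround, demonstrated in a throwaway directory standing in for /var/lib/hive/metastore/metastore_db (the temp-directory setup is purely illustrative). To be clear: this is only safe once no Spark/Hive JVM still holds the database open.

```shell
# Stand-in for the real metastore_db directory; the same rm applies there.
demo_db=$(mktemp -d)
touch "$demo_db/db.lck" "$demo_db/dbex.lck"   # simulate leftover Derby lock files
rm -f "$demo_db"/*.lck                        # only safe when no JVM holds the database
ls "$demo_db"/*.lck 2>/dev/null | wc -l       # no lock files remain
```

Against the real path you would substitute /var/lib/hive/metastore/metastore_db for the demo directory.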
On Thu, Sep 22, 2016 at 5:59 PM, jypucca <jypu...@gmail.com> wrote:
>
> I installed Spark 2.0.0 and was trying the ML example (IndexToString) on
> this web page:
> http://spark.apache.org/docs/latest/ml-features.html#onehotencoder,
> using a Jupyter notebook (running PySpark) to create a simple DataFrame,
> and I keep getting a long error message (see below). PySpark has worked
> fine with RDDs, but any time I try to do anything with a DataFrame it
> keeps throwing error messages. Any help would be appreciated, thanks!
>
> *******************************************************************
> Py4JJavaError: An error occurred while calling o23.applySchemaToPythonRDD.
> : java.lang.RuntimeException: java.lang.RuntimeException: Unable to
> instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>     at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>     at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:171)
>     [snip: reflection, Spark SQL catalog, and Py4J frames]
>     at py4j.GatewayConnection.run(GatewayConnection.java:211)
>     at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException: Unable to instantiate
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>     at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
>     ... 32 more
> Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test
> connection to the given database. JDBC url =
> jdbc:derby:;databaseName=metastore_db;create=true, username = APP.
> Terminating connection pool (set lazyInit to true if you expect to start
> your database after your app).
>     [snip: Derby, BoneCP, DataNucleus, and Hive metastore frames]
> Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class
> loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@d187feb,
> see the next exception for details.
>     ... 97 more
> Caused by: ERROR XSDB6: Another instance of Derby may have already booted
> the database C:\Sparkcourse\metastore_db.
>     at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
>     ... 94 more
> [snip: the NestedThrowables section repeats the same exception chain]
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/pyspark-ML-example-not-working-tp27780.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
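The root cause at the bottom of that trace, "ERROR XSDB6: Another instance of Derby may have already booted the database", means some JVM (often a second notebook kernel or shell started from the same working directory) still holds the embedded Derby metastore. Before deleting lock files it is worth checking for such a process; a sketch, where the process-name patterns are assumptions you may need to adjust for your setup:

```shell
# Look for JVM processes that may still hold the embedded Derby metastore.
# pgrep exits non-zero when nothing matches, so the fallback message fires.
pgrep -f 'pyspark|spark-submit' || echo "no matching Spark processes found"
```

If a process does show up, stopping it (or restarting the stray kernel) releases the lock cleanly, which is safer than removing *.lck files by hand.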