Turns out this was the same issue as the one discussed here:

http://getsatisfaction.com/cloudera/topics/hive_error_error_in_metadata_javax_jdo_jdofatalinternalexception

Removing the $HADOOP_HOME/build directory worked like a charm.
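For anyone who hits the same NullPointerException out of DataNucleus's NonManagedPluginRegistry: the stale jars under $HADOOP_HOME/build apparently end up on Hive's classpath and confuse DataNucleus's plugin registration. A rough sketch of the cleanup, assuming HADOOP_HOME points at your Hadoop install and you don't need the local build output:

    # optional sanity check: see which jars under build/ Hive would pick up
    find "$HADOOP_HOME/build" -name "*.jar" 2>/dev/null | head

    # remove the build directory so the duplicate jars are off the classpath
    rm -rf "$HADOOP_HOME/build"

    # retry; with the default embedded Derby setup, metastore_db should be
    # recreated in the directory you launch Hive from
    hive -e "show tables;"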

On Jun 30, 2011, at 6:02 PM, Matthieu Martin wrote:

> I was able to use Hive successfully a few weeks ago.  When I tried again 
> recently, I ran into issues and suspected that the problems might be 
> related to Hive's metastore.  For better or worse, I tried to simply delete 
> the "metastore_db" folder and start fresh.  However, the "metastore_db" 
> folder does not seem to be automatically recreated when I run Hive, and I 
> still see the same error messages that I saw before deleting it.
>  
> 
> Just for reference: I'm running Hadoop on Mac OS X in pseudo-distributed mode, 
> and the MapReduce jobs seem to complete without any issue.  I do not see any 
> significant error or warning messages in the datanode, task tracker, job 
> tracker, name node, etc.  Also, as far as I can tell, Derby is installed and 
> write permissions should not be an issue.
> 
> Any help would be greatly appreciated.
> 
> Thanks!
> 
> ----------------------------------------------------------
> 
> Error from the Hive log:
> 
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
> org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
>         at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1028)
>         at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1013)
>         at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:1691)
>         at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:289)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:130)
>         at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1063)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:900)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:748)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:164)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:241)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:456)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
> Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
>         at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1186)
>         at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
>         at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
>         at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:234)
>         at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:261)
>         at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:196)
>         at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:171)
>         at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
>         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:354)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:306)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:451)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:232)
>         at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:197)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:108)
>         at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:1855)
>         at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:1865)
>         at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1024)
>         ... 16 more
> Caused by: java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at javax.jdo.JDOHelper$16.run(JDOHelper.java:1958)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.jdo.JDOHelper.invoke(JDOHelper.java:1953)
>         at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
>         ... 33 more
> Caused by: java.lang.NullPointerException
>         at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:443)
>         at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:355)
>         at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:215)
>         at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:156)
>         at org.datanucleus.plugin.PluginManager.registerExtensionPoints(PluginManager.java:82)
>         at org.datanucleus.OMFContext.<init>(OMFContext.java:156)
>         at org.datanucleus.OMFContext.<init>(OMFContext.java:137)
>         at org.datanucleus.ObjectManagerFactoryImpl.initialiseOMFContext(ObjectManagerFactoryImpl.java:132)
>         at org.datanucleus.jdo.JDOPersistenceManagerFactory.initialiseProperties(JDOPersistenceManagerFactory.java:363)
>         at org.datanucleus.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:307)
>         at org.datanucleus.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:255)
>         at org.datanucleus.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:182)
>         ... 41 more
> 
> 2011-06-30 17:05:05,060 ERROR ql.Driver (SessionState.java:printError(343)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
> 
> 
> 
