That's because org.apache.hadoop.mapreduce.util.HostUtil comes from MR2, while you (I think) are running MR1. Hive has naive version-detection code in common/java/org/apache/hadoop/hive/shims/ShimLoader.java:

    String getMajorVersion() {
      String vers = VersionInfo.getVersion();

So it picks up something like 2.0.0-cdh4.1.0, even though you are actually running the old Hadoop, whose version is, for example, 2.0.0-mr1-cdh4.1.0. I will write to Cloudera to get this fixed, but a simple workaround is to fake the shim mapping:

    HADOOP_SHIM_CLASSES.put("0.23", "org.apache.hadoop.hive.shims.Hadoop20SShims");

instead of:

    HADOOP_SHIM_CLASSES.put("0.23", "org.apache.hadoop.hive.shims.Hadoop23Shims");

On Wednesday, February 27, 2013, 6:39:36 UTC+4, Eric Chu wrote:

> (+hue-user since this issue prevents me from successfully installing Hue from source)
>
> Hi,
>
> I recently did the following with both Hive-0.10 and Hive-0.9, and had a problem with 0.10 that I didn't see with 0.9:
>
> - Checked out the respective branch from github
> - Did an "ant package"
> - Copied the dist folder to /usr/lib/hive on the right machine
> - Copied mysql-connector-java-5.1.22-bin.jar to /usr/lib/hive/lib
> - Configured /etc/hive/conf (so it is the same for both versions)
>
> The problem is that when I use Hive-0.10, doing a "select count(1) from table" (or anything that requires MR) returns a NoClassDefFound error (see *Error Msg* below), whereas when I use Hive-0.9, the job runs fine. Has anyone run into this problem? I can't use Hive-0.9 b/c it has libthrift-0.7.0.jar, while Hue-2.2 (which I'm also using) requires libthrift-0.9.0.jar (available in Hive-0.10).
>
> Any insights would be much appreciated. Googling on this error doesn't get very far. Thanks!
> *Error Msg:*
>
> Total MapReduce jobs = 1
> Launching Job 1 out of 1
> Number of reduce tasks determined at compile time: 1
> In order to change the average load for a reducer (in bytes):
>   set hive.exec.reducers.bytes.per.reducer=<number>
> In order to limit the maximum number of reducers:
>   set hive.exec.reducers.max=<number>
> In order to set a constant number of reducers:
>   set mapred.reduce.tasks=<number>
> Starting Job = job_201302201756_0009, Tracking URL = http://master-hadoop.pww-arp-dev.rfiserve.net:50030/jobdetails.jsp?jobid=job_201302201756_0009
> Kill Command = /usr/lib/hadoop/bin/hadoop job -kill job_201302201756_0009
> Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
> 2013-02-26 21:05:01,060 Stage-1 map = 0%, reduce = 0%
> 2013-02-26 21:05:32,286 Stage-1 map = 100%, reduce = 100%
> Ended Job = job_201302201756_0009 with errors
> Error during job, obtaining debugging information...
> Job Tracking URL: http://master-hadoop.pww-arp-dev.rfiserve.net:50030/jobdetails.jsp?jobid=job_201302201756_0009
> Examining task ID: task_201302201756_0009_m_000002 (and more) from job job_201302201756_0009
> Exception in thread "Thread-29" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/util/HostUtil
>     at org.apache.hadoop.hive.shims.Hadoop23Shims.getTaskAttemptLogUrl(Hadoop23Shims.java:53)
>     at org.apache.hadoop.hive.ql.exec.JobDebugger$TaskInfoGrabber.getTaskInfos(JobDebugger.java:186)
>     at org.apache.hadoop.hive.ql.exec.JobDebugger$TaskInfoGrabber.run(JobDebugger.java:142)
>     at java.lang.Thread.run(Thread.java:619)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.util.HostUtil
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
>     at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
>     ... 4 more
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
> MapReduce Jobs Launched:
> Job 0: Map: 1  Reduce: 1  HDFS Read: 0  HDFS Write: 0  FAIL
> Total MapReduce CPU Time Spent: 0 msec
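To make the failure mode concrete, here is a minimal, self-contained sketch of the version-to-shim mapping described at the top of this thread. The class and map names mirror Hive's ShimLoader, but the parsing and selection logic is a simplified assumption for illustration, not the real implementation: the point is that both MR1 and MR2 CDH4 version strings begin with "2", so both resolve to the "0.23" key and hence to Hadoop23Shims.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical simplification of Hive's ShimLoader version mapping.
// The real class is org.apache.hadoop.hive.shims.ShimLoader; only the
// names are mirrored here, the logic is reduced for illustration.
public class ShimLoaderSketch {
    static final Map<String, String> HADOOP_SHIM_CLASSES = new HashMap<>();
    static {
        HADOOP_SHIM_CLASSES.put("0.20",  "org.apache.hadoop.hive.shims.Hadoop20Shims");
        HADOOP_SHIM_CLASSES.put("0.20S", "org.apache.hadoop.hive.shims.Hadoop20SShims");
        HADOOP_SHIM_CLASSES.put("0.23",  "org.apache.hadoop.hive.shims.Hadoop23Shims");
    }

    // Keeps only "major.minor" of the version string, so both
    // "2.0.0-cdh4.1.0" (MR2) and "2.0.0-mr1-cdh4.1.0" (MR1)
    // collapse to the same "2.0" -- the "-mr1-" marker is discarded.
    static String getMajorVersion(String vers) {
        String[] parts = vers.split("\\.");
        return parts[0] + "." + parts[1];
    }

    // Any 2.x version is treated as "0.23", so an MR1 cluster still
    // gets Hadoop23Shims, whose getTaskAttemptLogUrl() needs the
    // MR2-only class org.apache.hadoop.mapreduce.util.HostUtil.
    static String selectShimClass(String vers) {
        String major = getMajorVersion(vers);
        String key = major.startsWith("2") ? "0.23" : major;
        return HADOOP_SHIM_CLASSES.get(key);
    }
}
```

Under this sketch, both CDH4 version strings select Hadoop23Shims, which is exactly why the workaround above points the "0.23" entry at Hadoop20SShims instead on an MR1 cluster.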