It looks more like a permissions problem to me. Make sure that whatever local directories Hadoop is writing to (the staging and scratch directories under /tmp, for instance) are owned by the user actually running the job.
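For example, a quick sanity check on the local scratch directory (I'm using /tmp/umroot from your log below as the likely culprit; substitute your own paths):

```shell
# Sketch: verify the local scratch dir Hadoop writes to exists and is
# writable by the user running the job. /tmp/umroot is taken from the
# error log; adjust the path to your own staging/scratch dirs.
SCRATCH=${SCRATCH:-/tmp/umroot}
mkdir -p "$SCRATCH"            # create it if missing (the ENOENT case)
ls -ld "$SCRATCH"              # inspect current owner and mode
chmod 755 "$SCRATCH"           # make it traversable and writable
stat -c '%U %a' "$SCRATCH"     # should print the job user and 755
```

If `ls -ld` shows the directory owned by a different user (root, say), chown it, as root, to the user that runs Hadoop.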
Also, it looks a little weird to me that it is using the "RawLocalFileSystem" instead of the "DistributedFileSystem". You might want to look at the "fs.default.name" property in core-site.xml and see if it is pointing to your HDFS location. Hope that helps.

On Thu, May 10, 2012 at 11:29 AM, Mahsa Mofidpoor <mofidp...@gmail.com> wrote:

> Hi,
>
> When I want to join two tables, I receive the following error:
>
> 12/05/10 12:03:31 WARN conf.HiveConf: hive-site.xml not found on CLASSPATH
> WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please
> use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties
> files.
> Execution log at:
> /tmp/umroot/umroot_20120510120303_4d0145bb-27fa-4d4a-8cbc-95d8353fccaf.log
> ENOENT: No such file or directory
>     at org.apache.hadoop.io.nativeio.NativeIO.chmod(Native Method)
>     at org.apache.hadoop.fs.FileUtil.execSetPermission(FileUtil.java:692)
>     at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:647)
>     at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
>     at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
>     at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
>     at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>     at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
>     at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
>     at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:435)
>     at org.apache.hadoop.hive.ql.exec.ExecDriver.main(ExecDriver.java:693)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Job Submission failed with exception
> 'org.apache.hadoop.io.nativeio.NativeIOException(No such file or directory)'
> Execution failed with exit status: 2
> Obtaining error information
>
> Task failed!
> Task ID:
>   Stage-1
>
> Logs:
>
> /tmp/umroot/hive.log
> FAILED: Execution Error, return code 2 from
> org.apache.hadoop.hive.ql.exec.MapRedTask
>
>
> I use hadoop-0.20.2 (single-node setup) and I have built Hive from the
> latest source code.
>
>
> Thank you in advance for your help,
>
> Mahsa

--
Swarnim
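(On the RawLocalFileSystem point above: a minimal core-site.xml entry pointing "fs.default.name" at a single-node HDFS is sketched below. hdfs://localhost:9000 is only a common single-node default, not necessarily yours; use whatever host and port your namenode actually listens on.)

```xml
<?xml version="1.0"?>
<!-- Sketch of the relevant core-site.xml property; localhost:9000 is an
     assumed single-node default - point it at your actual namenode. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```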