Hi,

I am getting an exception while running a MapReduce job:

hadoop jar hadoop-examples-1.0.4.jar wordcount /test/input/1.txt /test/output
*Exception Details:*

2013-02-13 12:20:16,449 INFO org.apache.hadoop.mapred.JobTracker: Adding tracker tracker_gaikwadt-ethz:localhost/127.0.0.1:35209 to host gaikwadt-ethz
2013-02-13 12:20:24,975 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:gaikwadt cause:java.io.IOException: java.io.FileNotFoundException: In getFileStatus Filename: /job.xml Error: No such file or directory.
2013-02-13 12:20:24,977 INFO org.apache.hadoop.ipc.Server: IPC Server handler 0 on 9001, call submitJob(job_201302131220_0001, tcp://host=X.X.X.X,port=1XXXX/tmp/hadoop-gaikwadt/mapred/staging/gaikwadt/.staging/job_201302131220_0001, org.apache.hadoop.security.Credentials@18abe654) from 127.0.0.1:42227: error: java.io.IOException: java.io.FileNotFoundException: In getFileStatus Filename: /job.xml Error: No such file or directory.
java.io.IOException: java.io.FileNotFoundException: In getFileStatus Filename: /job.xml Error: No such file or directory.
    at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3944)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:567)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1125)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
Caused by: java.io.FileNotFoundException: In getFileStatus Filename: /job.xml Error: No such file or directory.
    at org.apache.hadoop.fs.ramcloud.RCOperations.getRCFileStatus(RCOperations.java:89)
    at org.apache.hadoop.fs.ramcloud.RCFileSystem.getFileStatus(RCFileSystem.java:193)
    at org.apache.hadoop.mapred.JobInProgress.<init>(JobInProgress.java:407)
    at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:3942)
    ... 11 more

*mapred-site.xml:*

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>

My JobTracker and TaskTracker are running.

*FileSystem:*

I have configured a custom filesystem that performs all other operations without any problem. If I run the same job on a single node with this filesystem configuration, the job runs properly.

While debugging my filesystem, I found that during job submission it writes the file
*tcp://host=X.X.X.X,port=1XXXX/tmp/hadoop-gaikwadt/mapred/staging/gaikwadt/.staging/job_201302131220_0001/job.xml*
but later JobInProgress looks for */job.xml*, which is obviously not present in the filesystem. JobInProgress builds the complete path using JobInfo. I checked the source code and found that submitJob is implemented in both LocalJobRunner.java and JobTracker.java. JobTracker sets the path for job.xml in JobInfo, which is then read by JobInProgress. However, that is not the case with LocalJobRunner.java; it only sets the systemJobFile variable.

Is there some configuration needed to make this work properly? Let me know if the above explanation is not clear enough.

Thanks in advance.

Best Regards,
Trupti Gaikwad.
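P.S. In case it helps to reproduce what I am describing, below is a minimal standalone check I can run against hadoop-core-1.0.4.jar. The class name and the tcp:// host/port in it are placeholders of my own, not my real values; it only prints how org.apache.hadoop.fs.Path and java.net.URI split a staging-directory string of this form, so I can see whether the scheme and authority survive or the path collapses towards a bare /job.xml like the one in the exception.

import java.net.URI;

import org.apache.hadoop.fs.Path;

// Quick check of Path/URI parsing for a "tcp://host=...,port=..." style
// staging directory. Host and port below are placeholders, not real values.
public class StagingPathCheck {
    public static void main(String[] args) {
        String stagingDir = "tcp://host=10.0.0.1,port=12345"
                + "/tmp/hadoop-gaikwadt/mapred/staging/gaikwadt/.staging/job_201302131220_0001";

        Path submitDir = new Path(stagingDir);
        // Append job.xml to the submit dir, the way I assume the framework does.
        Path jobFile = new Path(submitDir, "job.xml");
        URI uri = jobFile.toUri();

        System.out.println("full path : " + jobFile);
        System.out.println("scheme    : " + uri.getScheme());
        System.out.println("authority : " + uri.getAuthority());
        System.out.println("path part : " + uri.getPath());
    }
}

This does not touch my filesystem at all; it is only meant to narrow down whether the path is already mangled by Path/URI parsing on the client side, or only later on the JobTracker side when it is read back from JobInfo.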