Looks like you're using MapR FS. Have you considered posting this question on their mailing list?
Cheers

On Tue, Jun 2, 2015 at 11:14 PM, Shashi Vishwakarma <[email protected]> wrote:

> Hi
>
> I have a MapReduce job for an HBase bulk load. The job converts data into
> HFiles and loads them into HBase, but after a certain map % the job fails.
> Below is the exception I am getting:
>
> Error: java.io.FileNotFoundException:
> /var/mapr/local/tm4/mapred/nodeManager/spill/job_1433110149357_0005/attempt_1433110149357_0005_m_000000_0/spill83.out.index
>     at org.apache.hadoop.fs.RawLocalFileSystem.open(RawLocalFileSystem.java:198)
>     at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:800)
>     at org.apache.hadoop.io.SecureIOUtils.openFSDataInputStream(SecureIOUtils.java:156)
>     at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:74)
>     at org.apache.hadoop.mapred.MapRFsOutputBuffer.mergeParts(MapRFsOutputBuffer.java:1382)
>     at org.apache.hadoop.mapred.MapRFsOutputBuffer.flush(MapRFsOutputBuffer.java:1627)
>     at org.apache.hadoop.mapred.MapTask$NewOutputCollector.close(MapTask.java:709)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:779)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:345)
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1566)
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>
> The error above says the file was not found, but I was able to locate
> that particular spill file on disk.
>
> The only thing I noticed is that the job works fine for a small set of
> data, but as the data grows the job starts failing.
>
> Let me know if anyone has faced this issue.
>
> Thanks
>
> Shashi
