Hadoop-Hdfs-trunk - Build # 748 - Still Failing
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/748/

###################################################################
## LAST 60 LINES OF THE CONSOLE ##
###################################################################
[...truncated 1467 lines...]
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:331: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]     throws IOException, UnresolvedLinkException {
    [javac]                         ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst)
    [javac]                              ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst)
    [javac]                                        ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:338: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]     throws IOException, UnresolvedLinkException {
    [javac]                         ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:343: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst, boolean overwrite)
    [javac]                              ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 100 errors

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:370: Compile failed; see the compiler error output for details.

Total time: 17 seconds

==============================================================
==============================================================
STORE: saving artifacts
==============================================================
==============================================================

mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2237
Updating HDFS-2245
Updating HDFS-2241
Email was triggered for: Failure
Sending email for trigger: Failure

###################################################################
## FAILED TESTS (if any) ##
###################################################################
No tests ran.
[jira] [Resolved] (HDFS-90) dfs.name.dir disk full
     [ https://issues.apache.org/jira/browse/HDFS-90?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eli Collins resolved HDFS-90.
-----------------------------
    Resolution: Duplicate

Dupe of HDFS-1594

> dfs.name.dir disk full
> ----------------------
>
>                 Key: HDFS-90
>                 URL: https://issues.apache.org/jira/browse/HDFS-90
>             Project: Hadoop HDFS
>          Issue Type: Bug
>         Environment: ubuntu linux
>            Reporter: Torsten Curdt
>            Priority: Critical
>
> On our development cluster the namenode's disk filled up. This corrupted both the fsimage and the edits file. This should never happen!
> The files were truncated and I was able to fix them with a hex editor. Of course some data must have been lost.
> (see also possibly related https://issues.apache.org/jira/browse/HADOOP-2550 )

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
[jira] [Resolved] (HDFS-218) name node should provide status of dfs.name.dir's
     [ https://issues.apache.org/jira/browse/HDFS-218?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eli Collins resolved HDFS-218.
------------------------------
    Resolution: Duplicate

HDFS-1594 introduced a NN disk monitoring daemon.

> name node should provide status of dfs.name.dir's
> --------------------------------------------------
>
>                 Key: HDFS-218
>                 URL: https://issues.apache.org/jira/browse/HDFS-218
>             Project: Hadoop HDFS
>          Issue Type: New Feature
>            Reporter: Allen Wittenauer
>            Assignee: Ravi Phulari
>         Attachments: HDFS-218-trunk.patch, HDFS-218.patch
>
> We've had several reports of people letting their dfs.name.dir fill up. To help prevent this, the name node web UI (and perhaps dfsadmin -report or another command) should give a disk space report for all dfs.name.dir's, along with whether the contents of each dir are actually being used, whether the copy is "good", the time of the last secondary name node update, and anything else that might be useful.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
[jira] [Resolved] (HDFS-1176) Unsupported symbols in ClientProtocol.java (line 602)
     [ https://issues.apache.org/jira/browse/HDFS-1176?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eli Collins resolved HDFS-1176.
-------------------------------
    Resolution: Fixed

Fixed in HDFS-1096

> Unsupported symbols in ClientProtocol.java (line 602)
> -----------------------------------------------------
>
>                 Key: HDFS-1176
>                 URL: https://issues.apache.org/jira/browse/HDFS-1176
>             Project: Hadoop HDFS
>          Issue Type: Bug
>            Reporter: Konstantin Boudnik
>
> The JavaDoc of the setSafeMode method contains an illegal UTF-8 symbol.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
[jira] [Resolved] (HDFS-1048) org.apache.hadoop.security.AccessControlException should show full path on error
     [ https://issues.apache.org/jira/browse/HDFS-1048?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eli Collins resolved HDFS-1048.
-------------------------------
    Resolution: Duplicate

Fixed in HDFS-1628

> org.apache.hadoop.security.AccessControlException should show full path on error
> ---------------------------------------------------------------------------------
>
>                 Key: HDFS-1048
>                 URL: https://issues.apache.org/jira/browse/HDFS-1048
>             Project: Hadoop HDFS
>          Issue Type: Improvement
>    Affects Versions: 0.20.2, 0.21.0, 0.22.0
>            Reporter: Allen Wittenauer
>
> The error message generated by org.apache.hadoop.security.AccessControlException is of limited use because it shows only the inode name, not the full path. For example:
> org.apache.hadoop.security.AccessControlException: Permission denied: user=pymk, access=WRITE, inode="_join.temporary":hadoop:hadoop:rwxr-xr-x
> Where is this mysterious _join.temporary directory?
> If the full directory path were given, this error would be much more useful.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
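As a purely illustrative sketch of what the requested improvement amounts to, the check below throws the exception with the full path in the message; the class and method names are hypothetical and are not the actual HDFS permission checker:

    // Hypothetical sketch only: report the full path, not just the bare
    // inode name, when a permission check fails. Names are illustrative.
    import org.apache.hadoop.security.AccessControlException;

    public class PermissionMessageSketch {
      static void checkWrite(String fullPath, String user, String owner,
                             String group, String perms, boolean allowed)
          throws AccessControlException {
        if (!allowed) {
          // With fullPath (e.g. /user/pymk/_join.temporary) the message is
          // actionable, unlike inode="_join.temporary" alone.
          throw new AccessControlException("Permission denied: user=" + user
              + ", access=WRITE, inode=\"" + fullPath + "\":"
              + owner + ":" + group + ":" + perms);
        }
      }
    }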
[jira] [Resolved] (HDFS-414) add fuse-dfs to src/contrib/build.xml test target
     [ https://issues.apache.org/jira/browse/HDFS-414?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eli Collins resolved HDFS-414.
------------------------------
    Resolution: Won't Fix

Out of date

> add fuse-dfs to src/contrib/build.xml test target
> --------------------------------------------------
>
>                 Key: HDFS-414
>                 URL: https://issues.apache.org/jira/browse/HDFS-414
>             Project: Hadoop HDFS
>          Issue Type: Test
>            Reporter: Pete Wyckoff
>            Assignee: Pete Wyckoff
>            Priority: Minor
>         Attachments: HADOOP-4644.txt
>
> Since the contrib/build.xml test target now specifically includes contrib projects rather than all of them, fuse-dfs needs to be added.
> Note that the fuse-dfs test target is gated on -Dfusedfs=1 and -Dlibhdfs=1, so just running ant test-contrib will not actually trigger it.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
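Given the gating described in the report, presumably an invocation along these lines would be needed to actually exercise the fuse-dfs tests (the two -D flags are taken from the report itself; the exact target is the one it names):

    ant test-contrib -Dfusedfs=1 -Dlibhdfs=1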
[jira] [Resolved] (HDFS-276) Checksums for Namenode image files
     [ https://issues.apache.org/jira/browse/HDFS-276?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eli Collins resolved HDFS-276.
------------------------------
    Resolution: Duplicate

Dupe of HDFS-903

> Checksums for Namenode image files
> ----------------------------------
>
>                 Key: HDFS-276
>                 URL: https://issues.apache.org/jira/browse/HDFS-276
>             Project: Hadoop HDFS
>          Issue Type: Improvement
>            Reporter: Raghu Angadi
>
> Currently DFS can write multiple copies of the image files, but we do not recover well from corrupted or truncated image files. This jira proposes keeping a CRC for each image file record so that the Namenode can recover an accurate image as long as the data exists in at least one of the copies (e.g., non-overlapping corruptions across the copies should be fine). Will add more details in the next comment.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
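To make the proposal concrete, here is a minimal, purely illustrative sketch of per-record checksumming using java.util.zip.CRC32; it is not the format that any HDFS jira actually adopted:

    // Illustrative sketch: length-prefix each image record and append its
    // CRC32, so a reader can detect per-record corruption and fall back to
    // another copy of the image.
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.util.zip.CRC32;

    public class ChecksummedRecordWriter {
      private final DataOutputStream out;

      public ChecksummedRecordWriter(OutputStream os) {
        this.out = new DataOutputStream(os);
      }

      public void writeRecord(byte[] record) throws IOException {
        CRC32 crc = new CRC32();
        crc.update(record, 0, record.length);
        out.writeInt(record.length);    // length prefix
        out.write(record);              // record payload
        out.writeLong(crc.getValue());  // CRC32 of the payload
      }
    }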
[jira] [Created] (HDFS-2246) Shortcut a local client reads to a Datanodes files directly
Shortcut a local client reads to a Datanodes files directly
-----------------------------------------------------------

                 Key: HDFS-2246
                 URL: https://issues.apache.org/jira/browse/HDFS-2246
             Project: Hadoop HDFS
          Issue Type: Improvement
            Reporter: Sanjay Radia


--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
Hadoop-Hdfs-trunk-Commit - Build # 826 - Still Failing
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/826/

###################################################################
## LAST 60 LINES OF THE CONSOLE ##
###################################################################
[...truncated 2448 lines...]
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.095 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.116 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.217 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy FAILED
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 21.559 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.282 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 9.604 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestInterDatanodeProtocol
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.831 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestSimulatedFSDataset
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.684 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestBackupNode
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 19.929 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestCheckpoint
    [junit] Tests run: 27, Failures: 0, Errors: 1, Time elapsed: 72.463 sec
    [junit] Test org.apache.hadoop.hdfs.server.namenode.TestCheckpoint FAILED
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestEditLog
    [junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 23.348 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestFileLimit
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 4.225 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestNamenodeCapacityReport
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.766 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestSafeMode
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.333 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStartup
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 23.146 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStorageRestore
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 15.854 sec
    [junit] Running org.apache.hadoop.net.TestNetworkTopology
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.115 sec
    [junit] Running org.apache.hadoop.security.TestPermission
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.841 sec

checkfailure:
    [touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:733: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:690: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:758: Tests failed!

Total time: 11 minutes 22 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2229
Email was triggered for: Failure
Sending email for trigger: Failure

###################################################################
## FAILED TESTS (if any) ##
###################################################################
2 tests failed.

FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
        at org.apache.hadoop.
Hadoop-Hdfs-trunk-Commit - Build # 827 - Still Failing
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/827/

###################################################################
## LAST 60 LINES OF THE CONSOLE ##
###################################################################
[...truncated 2449 lines...]
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.097 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.059 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.22 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy FAILED
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 21.53 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.345 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 9.393 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestInterDatanodeProtocol
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.79 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestSimulatedFSDataset
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.749 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestBackupNode
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 20.426 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestCheckpoint
    [junit] Tests run: 27, Failures: 0, Errors: 1, Time elapsed: 75.523 sec
    [junit] Test org.apache.hadoop.hdfs.server.namenode.TestCheckpoint FAILED
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestEditLog
    [junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 22.78 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestFileLimit
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 4.564 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestNamenodeCapacityReport
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.809 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestSafeMode
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 3.995 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStartup
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 22.797 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStorageRestore
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 16.167 sec
    [junit] Running org.apache.hadoop.net.TestNetworkTopology
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.118 sec
    [junit] Running org.apache.hadoop.security.TestPermission
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.935 sec

checkfailure:
    [touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:733: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:690: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:758: Tests failed!

Total time: 11 minutes 46 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HADOOP-6158
Email was triggered for: Failure
Sending email for trigger: Failure

###################################################################
## FAILED TESTS (if any) ##
###################################################################
2 tests failed.

FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
        at org.apache.hadoop.hd
[jira] [Resolved] (HDFS-978) Record every new block allocation of a file into the transaction log.
     [ https://issues.apache.org/jira/browse/HDFS-978?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Todd Lipcon resolved HDFS-978.
------------------------------
    Resolution: Duplicate

Going to resolve this as duplicate of HDFS-1108, where there's an initial patch.

> Record every new block allocation of a file into the transaction log.
> ----------------------------------------------------------------------
>
>                 Key: HDFS-978
>                 URL: https://issues.apache.org/jira/browse/HDFS-978
>             Project: Hadoop HDFS
>          Issue Type: Improvement
>          Components: name-node
>            Reporter: dhruba borthakur
>            Assignee: Todd Lipcon
>
> HDFS should record every new block allocation (of a file) into its transaction logs. In the current code, block allocations are persisted only when a file is closed or hflush-ed. This feature will enable HDFS writers to survive namenode restarts.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
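Roughly, the proposal is to write an edit-log record at block-allocation time instead of only at close/hflush; the sketch below shows the shape of such a hook with hypothetical names (the real namenode code is considerably more involved, and this is not the actual FSNamesystem/FSEditLog API):

    // Hypothetical sketch: persist each block allocation to the edit log as
    // it happens, so a writer's partially written file survives a NN restart.
    public class BlockAllocationLoggingSketch {
      interface EditLog {
        void logBlockAllocation(String path, long blockId, long genStamp);
        void logSync(); // flush buffered edits to stable storage
      }

      private final EditLog editLog;

      BlockAllocationLoggingSketch(EditLog editLog) {
        this.editLog = editLog;
      }

      long allocateBlock(String path, long blockId, long genStamp) {
        // ...choose target datanodes and update in-memory state (elided)...
        editLog.logBlockAllocation(path, blockId, genStamp); // record the allocation
        editLog.logSync(); // make it durable before acking the client
        return blockId;
      }
    }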
Hadoop-Hdfs-trunk-Commit - Build # 828 - Still Failing
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/828/

###################################################################
## LAST 60 LINES OF THE CONSOLE ##
###################################################################
[...truncated 2450 lines...]
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.098 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.215 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy FAILED
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 21.412 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.248 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 9.497 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestInterDatanodeProtocol
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.791 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestSimulatedFSDataset
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.687 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestBackupNode
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 20.013 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestCheckpoint
    [junit] Tests run: 27, Failures: 0, Errors: 1, Time elapsed: 72.235 sec
    [junit] Test org.apache.hadoop.hdfs.server.namenode.TestCheckpoint FAILED
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestEditLog
    [junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 23.117 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestFileLimit
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 4.466 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestNamenodeCapacityReport
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.723 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestSafeMode
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.152 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStartup
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 22.995 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStorageRestore
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 15.869 sec
    [junit] Running org.apache.hadoop.net.TestNetworkTopology
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.115 sec
    [junit] Running org.apache.hadoop.security.TestPermission
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.853 sec

checkfailure:
    [touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:733: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:690: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:758: Tests failed!

Total time: 11 minutes 25 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemotePermissionException: This issue does not exist or you don't have permission to view it.
Email was triggered for: Failure
Sending email for trigger: Failure

###################################################################
## FAILED TESTS (if any) ##
###################################################################
2 tests failed.

FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
        at org.apache.hadoop.hdfs.server.nam
Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/829/

###################################################################
## LAST 60 LINES OF THE CONSOLE ##
###################################################################
[...truncated 1273 lines...]
    [iajc] ^^^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/data");
    [iajc] ^^^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String encodedPath = ServletUtil.getRawPath(request, "/data");
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/listPaths");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String filePath = ServletUtil.getDecodedPath(request, "/listPaths");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/streamFile");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String rawPath = ServletUtil.getRawPath(request, "/streamFile");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc]
    [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 55 seconds

==============================================================
==============================================================
STORE: saving artifacts
==============================================================
==============================================================

mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifact
Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing
Tucu and co - does hdfs build the latest common or does it try to
resolve against the latest deployed common artifact?
Looks like hudson-test-patch doesn't pick up on the latest common build.

On Thu, Aug 11, 2011 at 10:11 PM, Apache Jenkins Server wrote:
> See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/829/
>
> [...truncated...]
[jira] [Created] (HDFS-2257) HftpFileSystem should implement getDelegationTokens
HftpFileSystem should implement getDelegationTokens
---------------------------------------------------

                 Key: HDFS-2257
                 URL: https://issues.apache.org/jira/browse/HDFS-2257
             Project: Hadoop HDFS
          Issue Type: Bug
            Reporter: Siddharth Seth


--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing
Eli, I think you are right, I'm pretty sure it is picking up the latest
deployed snapshot. I'll discuss with Tom tomorrow morning how to take care
of this (once HDFS is Mavenized we can easily build/use the latest bits from
all modules, though some tricks will still be needed to avoid running every
module's tests).

Thxs.

Alejandro

On Thu, Aug 11, 2011 at 10:20 PM, Eli Collins wrote:
> Tucu and co - does hdfs build the latest common or does it try to
> resolve against the latest deployed common artifact?
> Looks like hudson-test-patch doesn't pick up on the latest common build.
>
> On Thu, Aug 11, 2011 at 10:11 PM, Apache Jenkins Server wrote:
> > See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/829/
> >
> > [...truncated...]
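For context: at this point the HDFS ant build resolved hadoop-common through Ivy, so Jenkins picked up whatever common SNAPSHOT had last been deployed. If memory serves, the usual local workaround was to install common into the local repository and point HDFS at the internal resolver, roughly as follows (targets and flags from memory; they may have differed slightly):

    # in hadoop-common/trunk: build common and install it into the local repo
    ant mvn-install
    # in hadoop-hdfs/trunk: resolve hadoop-common from the local repo
    ant -Dresolvers=internal clean compile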
Hadoop-Hdfs-trunk-Commit - Build # 830 - Still Failing
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/830/

###################################################################
## LAST 60 LINES OF THE CONSOLE ##
###################################################################
[...truncated 1274 lines...]
    [iajc] ^^^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/data");
    [iajc] ^^^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String encodedPath = ServletUtil.getRawPath(request, "/data");
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/listPaths");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String filePath = ServletUtil.getDecodedPath(request, "/listPaths");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/streamFile");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String rawPath = ServletUtil.getRawPath(request, "/streamFile");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc]
    [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 1 minute 2 seconds

==============================================================
==============================================================
STORE: saving artifacts
==============================================================
==============================================================

mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving
Hadoop-Hdfs-trunk-Commit - Build # 831 - Still Failing
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/831/

###################################################################
## LAST 60 LINES OF THE CONSOLE ##
###################################################################
[...truncated 1274 lines...]
    [iajc] ^^^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/data");
    [iajc] ^^^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String encodedPath = ServletUtil.getRawPath(request, "/data");
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/listPaths");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String filePath = ServletUtil.getDecodedPath(request, "/listPaths");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/streamFile");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String rawPath = ServletUtil.getRawPath(request, "/streamFile");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc]
    [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 55 seconds

==============================================================
==============================================================
STORE: saving artifacts
==============================================================
==============================================================

mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifact
Hadoop-Hdfs-trunk-Commit - Build # 832 - Still Failing
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/832/

###################################################################
## LAST 60 LINES OF THE CONSOLE ##
###################################################################
[...truncated 1273 lines...]
    [iajc] ^^^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/data");
    [iajc] ^^^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String encodedPath = ServletUtil.getRawPath(request, "/data");
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/listPaths");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String filePath = ServletUtil.getDecodedPath(request, "/listPaths");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String path = ServletUtil.getDecodedPath(request, "/streamFile");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
    [iajc] final String rawPath = ServletUtil.getRawPath(request, "/streamFile");
    [iajc] ^
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
    [iajc]
    [iajc]
    [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 58 seconds

==============================================================
==============================================================
STORE: saving artifacts
==============================================================
==============================================================

mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifact