Jenkins build is still unstable: Hadoop-Hdfs-0.23-Build #20

2011-09-25 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/20/

Hadoop-Hdfs-0.23-Build - Build # 20 - Still Unstable

2011-09-25 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/20/

###################################################################
## LAST 60 LINES OF THE CONSOLE
###################################################################
[...truncated 9501 lines...]
[INFO] --- maven-antrun-plugin:1.6:run (tar) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs ---
[INFO] 
[INFO] There are 9008 checkstyle errors.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs ---
[INFO] ** FindBugsMojo execute ***
[INFO] canGenerate is true
[INFO] ** FindBugsMojo executeFindbugs ***
[INFO] Temp File is /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/findbugsTemp.xml
[INFO] Fork Value is true
[INFO] xmlOutput is false
[INFO] 
[INFO] 
[INFO] Building Apache Hadoop HDFS Project 0.23.0-SNAPSHOT
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ** FindBugsMojo execute ***
[INFO] canGenerate is false
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS  SUCCESS [3:29.844s]
[INFO] Apache Hadoop HDFS Project  SUCCESS [0.059s]
[INFO] 
[INFO] BUILD SUCCESS
[INFO] 
[INFO] Total time: 3:30.344s
[INFO] Finished at: Sun Sep 25 11:37:10 UTC 2011
[INFO] Final Memory: 58M/752M
[INFO] 
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Publishing Clover coverage report...
Publishing Clover HTML report...
Publishing Clover XML report...
Publishing Clover coverage results...
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Publishing Javadoc
Recording fingerprints
Updating MAPREDUCE-2961
Updating MAPREDUCE-3053
Updating HDFS-2290
Updating HADOOP-7663
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Unstable
Sending email for trigger: Unstable



###################################################################
## FAILED TESTS (if any)
###################################################################
6 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.TestDFSRollback.testRollback

Error Message:
File contents differed:
  /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/test/data/dfs/data2/current/VERSION=06d38cf801314a6797f0a99b7e875e4e
  /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/test/data/dfs/data1/current/VERSION=8e2edff03238f0bb1febac9b10bef99e

Stack Trace:
java.lang.AssertionError: File contents differed:
  /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/test/data/dfs/data2/current/VERSION=06d38cf801314a6797f0a99b7e875e4e
  /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/test/data/dfs/data1/current/VERSION=8e2edff03238f0bb1febac9b10bef99e
at org.junit.Assert.fail(Assert.java:91)
at org.apache.hadoop.hdfs.server.namenode.FSImageTestUtil.assertFileContentsSame(FSImageTestUtil.java:250)
at org.apache.hadoop.hdfs.server.namenode.FSImageTestUtil.assertParallelFilesAreIdentical(FSImageTestUtil.java:236)
at org.apache.hadoop.hdfs.TestDFSRollback.checkResult(TestDFSRollback.java:86)
at org.apache.hadoop.hdfs.TestDFSRollback.__CLR3_0_27oj5yb11be(TestDFSRollback.java:171)
at org.apache.hadoop.hdfs.TestDFSRollback.testRollback(TestDFSRollback.java:134)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at junit.framework.TestCase.runTest(TestCase.java:168)
at junit.framework.

Jenkins build is still unstable: Hadoop-Hdfs-trunk #811

2011-09-25 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/811/

Hadoop-Hdfs-trunk - Build # 811 - Still Unstable

2011-09-25 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/811/

###################################################################
## LAST 60 LINES OF THE CONSOLE
###################################################################
[...truncated 9867 lines...]
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs ---
[INFO] 
[INFO] There are 9348 checkstyle errors.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs ---
[INFO] ** FindBugsMojo execute ***
[INFO] canGenerate is true
[INFO] ** FindBugsMojo executeFindbugs ***
[INFO] Temp File is /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/findbugsTemp.xml
[INFO] Fork Value is true
[INFO] xmlOutput is false
[INFO] 
[INFO] 
[INFO] Building Apache Hadoop HDFS Project 0.24.0-SNAPSHOT
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
[mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ** FindBugsMojo execute ***
[INFO] canGenerate is false
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS  SUCCESS [3:30.381s]
[INFO] Apache Hadoop HDFS Project  SUCCESS [0.087s]
[INFO] 
[INFO] BUILD SUCCESS
[INFO] 
[INFO] Total time: 3:30.864s
[INFO] Finished at: Sun Sep 25 11:36:55 UTC 2011
[INFO] Final Memory: 59M/762M
[INFO] 
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Publishing Javadoc
Recording fingerprints
Updating MAPREDUCE-2691
Updating MAPREDUCE-2990
Updating MAPREDUCE-3053
Updating HDFS-2290
Updating HADOOP-7663
Updating HADOOP-7457
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Unstable
Sending email for trigger: Unstable



###################################################################
## FAILED TESTS (if any)
###################################################################
4 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestDfsOverAvroRpc.testWorkingDirectory

Error Message:
Two methods with same name: delete

Stack Trace:
org.apache.avro.AvroTypeException: Two methods with same name: delete
at org.apache.avro.reflect.ReflectData.getProtocol(ReflectData.java:394)
at org.apache.avro.ipc.reflect.ReflectResponder.<init>(ReflectResponder.java:36)
at org.apache.hadoop.ipc.AvroRpcEngine.createResponder(AvroRpcEngine.java:180)
at org.apache.hadoop.ipc.AvroRpcEngine$TunnelResponder.<init>(AvroRpcEngine.java:187)
at org.apache.hadoop.ipc.AvroRpcEngine.getServer(AvroRpcEngine.java:223)
at org.apache.hadoop.ipc.RPC.getServer(RPC.java:570)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.<init>(NameNodeRpcServer.java:146)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createRpcServer(NameNode.java:355)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:333)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:457)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:449)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:747)
at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:637)
at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
at org.apache.hadoop.hdfs.TestLocalDFS.__CLR3_0_2hl
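
The AvroTypeException above comes from Avro's reflect-based protocol generation, which rejects interfaces that overload a method name; the HDFS protocol being reflected here evidently declares delete twice. The following is a minimal, self-contained Java sketch of that rejection and is not HDFS code: the class AvroOverloadDemo and the OverloadedDelete interface are illustrative stand-ins.

import org.apache.avro.AvroTypeException;
import org.apache.avro.reflect.ReflectData;

public class AvroOverloadDemo {
    // Illustrative stand-in for an RPC interface that overloads a method
    // name, similar in shape to the two delete() variants seen above.
    interface OverloadedDelete {
        boolean delete(String src);
        boolean delete(String src, boolean recursive);
    }

    public static void main(String[] args) {
        try {
            // getProtocol walks the interface's methods by name; two methods
            // named "delete" trigger the AvroTypeException from the trace.
            ReflectData.get().getProtocol(OverloadedDelete.class);
        } catch (AvroTypeException e) {
            System.out.println("Rejected as expected: " + e.getMessage());
        }
    }
}

Running this sketch should fail in the same way, with a "Two methods with same name: delete" style message.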

Hadoop-Hdfs-22-branch - Build # 89 - Still Failing

2011-09-25 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-22-branch/89/

###################################################################
## LAST 60 LINES OF THE CONSOLE
###################################################################
[...truncated 3123 lines...]
compile-hdfs-test:
   [delete] Deleting directory /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache

run-test-hdfs-excluding-commit-and-smoke:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/data
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/logs
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/extraconf
     [copy] Copying 1 file to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/extraconf
[junit] WARNING: multiple versions of ant detected in path for junit 
[junit]  jar:file:/home/jenkins/tools/ant/latest/lib/ant.jar!/org/apache/tools/ant/Project.class
[junit]  and jar:file:/home/jenkins/.ivy2/cache/ant/ant/jars/ant-1.6.5.jar!/org/apache/tools/ant/Project.class
[junit] Running org.apache.hadoop.fs.TestFiListPath
[junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 1.934 sec
[junit] Running org.apache.hadoop.fs.TestFiRename
[junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 67.874 sec
[junit] Running org.apache.hadoop.hdfs.TestFiHFlush
[junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 28.269 sec
[junit] Running org.apache.hadoop.hdfs.TestFiHftp
[junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 36.315 sec
[junit] Running org.apache.hadoop.hdfs.TestFiPipelines
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 4.706 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataTransferProtocol
[junit] Tests run: 29, Failures: 0, Errors: 0, Time elapsed: 206.194 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataTransferProtocol2
[junit] Tests run: 10, Failures: 0, Errors: 0, Time elapsed: 294.643 sec
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiPipelineClose
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 60.781 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:761: Tests failed!

Total time: 203 minutes 31 seconds
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################
## FAILED TESTS (if any)
###################################################################
5 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.TestCrcCorruption.testCrcCorruption

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.


REGRESSION:  org.apache.hadoop.hdfs.TestParallelRead.testParallelRead

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurr

[jira] [Resolved] (HDFS-2325) Fuse-DFS fails to build on Hadoop 20.203.0

2011-09-25 Thread Matt Foley (JIRA)

 [ https://issues.apache.org/jira/browse/HDFS-2325?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Matt Foley resolved HDFS-2325.
--

   Resolution: Fixed
Fix Version/s: 0.20.205.0

> Fuse-DFS fails to build on Hadoop 20.203.0
> --
>
> Key: HDFS-2325
> URL: https://issues.apache.org/jira/browse/HDFS-2325
> Project: Hadoop HDFS
>  Issue Type: Bug
>  Components: contrib/fuse-dfs, libhdfs
>Affects Versions: 0.20.203.0, 0.20.205.0
> Environment: Ubuntu 11.04, Linux 2.6.38-11-generic x86_64
>Reporter: Charles Earl
>Assignee: Kihwal Lee
>Priority: Blocker
>  Labels: hadoop, newbie
> Fix For: 0.20.205.0
>
> Attachments: hdfs-2325-branch-0.20-security.patch
>
>   Original Estimate: 10m
>  Remaining Estimate: 10m
>
> In building fuse-dfs, the compile fails due to an argument mismatch between the 
> call to hdfsConnectAsUser on line 40 of src/contrib/fuse-dfs/src/fuse_connect.c 
> and an earlier definition of hdfsConnectAsUser given in src/c++/libhdfs/hdfs.h.
> I suggest changing hdfs.h. I made the following change in hdfs.h in my local copy:
> 106c106,107
> <  hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user);
> ---
> >   // hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user);
> >   hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user, const char** groups, int numgroups);
> This new version successfully compiles.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira




[jira] [Created] (HDFS-2362) More Improvements on NameNode Scalability

2011-09-25 Thread Hairong Kuang (JIRA)
More Improvements on NameNode Scalability
-

 Key: HDFS-2362
 URL: https://issues.apache.org/jira/browse/HDFS-2362
 Project: Hadoop HDFS
  Issue Type: Improvement
  Components: name-node
Reporter: Hairong Kuang


This jira acts as an umbrella to track all the improvements we've made recently 
to improve the NameNode's performance, responsiveness, and hence scalability. 
Those improvements include:
1. Incremental block reports (HDFS-395)
2. An upgradable lock to allow simultaneous read operations while reportDiff is in progress during block report processing (see the sketch after this list)
3. A more CPU-efficient data structure for under-replicated/over-replicated/invalidate blocks
4. Increased granularity of write operations in ReplicationMonitor, thus reducing contention for the write lock
5. Support for variable block sizes
6. Release of RPC handlers while waiting for the edit log to be synced to disk
7. Reduced network traffic pressure on the master rack where the NN is located, by lowering the read priority of the replicas on that rack
8. A standalone KeepAlive heartbeat thread
9. Reduction of multiple traversals of the path directory to one for most namespace manipulations
10. Moving logging out of the write lock section.
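
For item 2, Java's standard ReentrantReadWriteLock has no in-place read-to-write upgrade, so a common way to get a similar effect is to compute the block-report diff under the read lock (letting concurrent reads proceed) and take the write lock only to apply the result. The sketch below illustrates that general pattern only; it is not the actual NameNode code, and BlockReportProcessor, computeDiff, and applyDiff are placeholder names.

import java.util.List;
import java.util.concurrent.locks.ReentrantReadWriteLock;

class BlockReportProcessor {
    private final ReentrantReadWriteLock lock = new ReentrantReadWriteLock();

    void processReport(List<Long> reportedBlockIds) {
        List<Long> toAdd;
        lock.readLock().lock();
        try {
            // The expensive comparison against current state runs under the
            // read lock, so other read operations can proceed concurrently.
            toAdd = computeDiff(reportedBlockIds);
        } finally {
            lock.readLock().unlock();
        }

        lock.writeLock().lock();
        try {
            // Only the short mutation phase needs exclusive access. Note that
            // state may have changed between the two acquisitions, so a real
            // implementation would re-validate the diff under the write lock.
            applyDiff(toAdd);
        } finally {
            lock.writeLock().unlock();
        }
    }

    // Placeholder implementations; the real logic lives in the NameNode.
    private List<Long> computeDiff(List<Long> reported) { return reported; }
    private void applyDiff(List<Long> toAdd) { }
}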


--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira