See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1399/changes>

Changes:

[suresh] HADOOP-9563. Fix incompatibility introduced by HADOOP-9523. Contributed by Tian Hong Wang.

[szetszwo] HDFS-3180. Add socket timeouts to WebHdfsFileSystem.  Contributed by Chris Nauroth

[szetszwo] HDFS-4813. Add volatile to BlocksMap.blocks so that the replication thread can see the updated value.  Contributed by Jing Zhao

[vinodkv] MAPREDUCE-5232. Add a configuration to be able to log classpath and other system properties on mapreduce JVMs startup. Contributed by Sangjin Lee.

[tucu] MAPREDUCE-5244. Two functions changed their visibility in JobStatus. (zjshen via tucu)

[todd] HADOOP-9220. Unnecessary transition to standby in ActiveStandbyElector. Contributed by Tom White and Todd Lipcon.

[todd] HADOOP-9307. BufferedFSInputStream.read returns wrong results after certain seeks. Contributed by Todd Lipcon.

[suresh] HADOOP-9560. metrics2#JvmMetrics should have max memory size of JVM. Contributed by Tsuyoshi Ozawa.

------------------------------------------
[...truncated 14298 lines...]
        at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
        at org.junit.runners.Suite.runChild(Suite.java:128)
        at org.junit.runners.Suite.runChild(Suite.java:24)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
        at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
        at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
        at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
        at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
        at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
        at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
        at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)

Running org.apache.hadoop.fs.http.client.TestHttpFSWithHttpFSFileSystem
Tests run: 32, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 19.593 sec <<< FAILURE!
testOperation[4](org.apache.hadoop.fs.http.client.TestHttpFSWithHttpFSFileSystem)  Time elapsed: 390 sec  <<< ERROR!
org.apache.hadoop.ipc.RemoteException(java.io.IOException): Specified block size is less than configured minimum value (dfs.namenode.fs-limits.min-block-size): 1024 < 1048576
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:1849)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1818)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:440)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:300)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:47995)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:527)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1033)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1842)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1838)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1489)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1836)

        at org.apache.hadoop.ipc.Client.call(Client.java:1303)
        at org.apache.hadoop.ipc.Client.call(Client.java:1255)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:204)
        at $Proxy19.create(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:163)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:82)
        at $Proxy19.create(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:227)
        at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1345)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1285)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1210)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:298)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:266)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:83)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:888)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:869)
        at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:233)
        at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:219)
        at org.apache.hadoop.fs.http.client.BaseTestHttpFSWith.testConcat(BaseTestHttpFSWith.java:220)
        at org.apache.hadoop.fs.http.client.BaseTestHttpFSWith.operation(BaseTestHttpFSWith.java:498)
        at org.apache.hadoop.fs.http.client.BaseTestHttpFSWith.testOperation(BaseTestHttpFSWith.java:558)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
        at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
        at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
        at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
        at org.apache.hadoop.test.TestHdfsHelper$HdfsStatement.evaluate(TestHdfsHelper.java:74)
        at org.apache.hadoop.test.TestDirHelper$1.evaluate(TestDirHelper.java:106)
        at org.apache.hadoop.test.TestDirHelper$1.evaluate(TestDirHelper.java:106)
        at org.apache.hadoop.test.TestJettyHelper$1.evaluate(TestJettyHelper.java:53)
        at org.apache.hadoop.test.TestExceptionHelper$1.evaluate(TestExceptionHelper.java:42)
        at org.junit.runners.BlockJUnit4ClassRunner.runNotIgnored(BlockJUnit4ClassRunner.java:79)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:71)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:49)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
        at org.junit.runners.Suite.runChild(Suite.java:128)
        at org.junit.runners.Suite.runChild(Suite.java:24)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
        at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
        at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
        at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
        at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
        at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
        at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
        at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)

testOperationDoAs[4](org.apache.hadoop.fs.http.client.TestHttpFSWithHttpFSFileSystem)  Time elapsed: 400 sec  <<< ERROR!
org.apache.hadoop.ipc.RemoteException(java.io.IOException): Specified block size is less than configured minimum value (dfs.namenode.fs-limits.min-block-size): 1024 < 1048576
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:1849)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1818)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:440)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:300)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:47995)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:527)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1033)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1842)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1838)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1489)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1836)

        at org.apache.hadoop.ipc.Client.call(Client.java:1303)
        at org.apache.hadoop.ipc.Client.call(Client.java:1255)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:204)
        at $Proxy19.create(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:163)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:82)
        at $Proxy19.create(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:227)
        at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1345)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1285)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1210)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:298)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:266)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:83)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:888)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:869)
        at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:233)
        at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:219)
        at org.apache.hadoop.fs.http.client.BaseTestHttpFSWith.testConcat(BaseTestHttpFSWith.java:220)
        at org.apache.hadoop.fs.http.client.BaseTestHttpFSWith.operation(BaseTestHttpFSWith.java:498)
        at org.apache.hadoop.fs.http.client.BaseTestHttpFSWith.access$100(BaseTestHttpFSWith.java:62)
        at org.apache.hadoop.fs.http.client.BaseTestHttpFSWith$1.run(BaseTestHttpFSWith.java:572)
        at org.apache.hadoop.fs.http.client.BaseTestHttpFSWith$1.run(BaseTestHttpFSWith.java:569)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1489)
        at org.apache.hadoop.fs.http.client.BaseTestHttpFSWith.testOperationDoAs(BaseTestHttpFSWith.java:569)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
        at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
        at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
        at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
        at org.apache.hadoop.test.TestHdfsHelper$HdfsStatement.evaluate(TestHdfsHelper.java:74)
        at org.apache.hadoop.test.TestDirHelper$1.evaluate(TestDirHelper.java:106)
        at org.apache.hadoop.test.TestDirHelper$1.evaluate(TestDirHelper.java:106)
        at org.apache.hadoop.test.TestJettyHelper$1.evaluate(TestJettyHelper.java:53)
        at org.apache.hadoop.test.TestExceptionHelper$1.evaluate(TestExceptionHelper.java:42)
        at org.junit.runners.BlockJUnit4ClassRunner.runNotIgnored(BlockJUnit4ClassRunner.java:79)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:71)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:49)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
        at org.junit.runners.Suite.runChild(Suite.java:128)
        at org.junit.runners.Suite.runChild(Suite.java:24)
        at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
        at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
        at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
        at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
        at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
        at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
        at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
        at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
        at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)
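
Both HttpFS failures above (and the two matching TestHttpFSFWithWebhdfsFileSystem errors in the summary below) are the same server-side rejection: the tests create files with a 1024-byte block size, and the test NameNode enforces the default dfs.namenode.fs-limits.min-block-size of 1048576 bytes (1 MB). The sketch below only illustrates the shape of that NameNode-side check; the class, method, and variable names are assumptions, not the actual FSNamesystem code.

    // Illustrative sketch only -- names are assumptions, not the real FSNamesystem.
    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;

    public class MinBlockSizeCheck {
      /** Rejects a create() request whose block size is below the configured minimum. */
      static void checkBlockSize(Configuration conf, long blockSize) throws IOException {
        // Default minimum is 1 MB (1048576 bytes), the value seen in the log.
        long min = conf.getLong("dfs.namenode.fs-limits.min-block-size", 1024 * 1024);
        if (blockSize < min) {
          throw new IOException("Specified block size is less than configured minimum value "
              + "(dfs.namenode.fs-limits.min-block-size): " + blockSize + " < " + min);
        }
      }
    }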

Running org.apache.hadoop.fs.http.client.TestHttpFSFileSystemLocalFileSystem
Tests run: 32, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.436 sec

Results :

Tests in error: 
  testOperation[4](org.apache.hadoop.fs.http.client.TestHttpFSFWithWebhdfsFileSystem): Specified block size is less than configured minimum value (dfs.namenode.fs-limits.min-block-size): 1024 < 1048576(..)
  testOperationDoAs[4](org.apache.hadoop.fs.http.client.TestHttpFSFWithWebhdfsFileSystem): Specified block size is less than configured minimum value (dfs.namenode.fs-limits.min-block-size): 1024 < 1048576(..)
  testOperation[4](org.apache.hadoop.fs.http.client.TestHttpFSWithHttpFSFileSystem): Specified block size is less than configured minimum value (dfs.namenode.fs-limits.min-block-size): 1024 < 1048576(..)
  testOperationDoAs[4](org.apache.hadoop.fs.http.client.TestHttpFSWithHttpFSFileSystem): Specified block size is less than configured minimum value (dfs.namenode.fs-limits.min-block-size): 1024 < 1048576(..)

Tests run: 286, Failures: 0, Errors: 4, Skipped: 0
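
All four errors come from DFSTestUtil.createFile being invoked with a 1024-byte block size while the minicluster NameNode keeps the default dfs.namenode.fs-limits.min-block-size of 1048576. A minimal sketch of the usual test-side remedy is shown below: lower the limit in the Configuration used to start the test cluster. The class name and the exact place the HttpFS test helpers build their Configuration are assumptions, not the actual fix that was committed.

    // Hypothetical test bootstrap: relax the NameNode minimum block size so the
    // tests may create files with 1024-byte blocks.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hdfs.MiniDFSCluster;

    public class SmallBlockTestCluster {
      public static MiniDFSCluster start() throws Exception {
        Configuration conf = new Configuration();
        // Accept block sizes down to the 1024 bytes used by the HttpFS tests.
        conf.setLong("dfs.namenode.fs-limits.min-block-size", 1024L);
        return new MiniDFSCluster.Builder(conf).numDataNodes(1).build();
      }
    }

Raising the block size passed to DFSTestUtil.createFile to at least 1048576 would satisfy the check as well, at the cost of larger test files.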

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:36:06.601s]
[INFO] Apache Hadoop HttpFS .............................. FAILURE [1:44.873s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:37:52.241s
[INFO] Finished at: Wed May 15 13:11:13 UTC 2013
[INFO] Final Memory: 51M/767M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12.3:test (default-test) on project hadoop-hdfs-httpfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-httpfs
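
Only the hadoop-hdfs-httpfs module failed, so the resume command above restarts the reactor from that module. When verifying a fix locally, the two failing suites can also be run in isolation with surefire's -Dtest property (for example, -Dtest=TestHttpFSWithHttpFSFileSystem) from the hadoop-hdfs-project/hadoop-hdfs-httpfs directory.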
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating HADOOP-9307
Updating HADOOP-9220
Updating HADOOP-9563
Updating HADOOP-9560
Updating MAPREDUCE-5232
Updating MAPREDUCE-5244
Updating HADOOP-9523
Updating HDFS-3180
Updating HDFS-4813
