Apache Hadoop qbt Report: branch-2.10+JDK7 on Linux/x86_64
For more details, see https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/

No changes

-1 overall

The following subsystems voted -1:
    asflicense hadolint mvnsite pathlen unit

The following subsystems voted -1 but were configured to be filtered/ignored:
    cc checkstyle javac javadoc pylint shellcheck whitespace

The following subsystems are considered long running: (runtime bigger than 1h 0m 0s)
    unit

Specific tests:

    Failed junit tests :
        hadoop.fs.TestFileUtil
        hadoop.io.compress.snappy.TestSnappyCompressorDecompressor
        hadoop.hdfs.server.blockmanagement.TestReplicationPolicyWithUpgradeDomain
        hadoop.hdfs.qjournal.server.TestJournalNodeRespectsBindHostKeys
        hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints
        hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints
        hadoop.hdfs.server.federation.resolver.TestMultipleDestinationResolver
        hadoop.hdfs.server.federation.router.TestRouterNamenodeHeartbeat
        hadoop.hdfs.server.federation.router.TestRouterQuota
        hadoop.hdfs.server.federation.resolver.order.TestLocalResolver
        hadoop.yarn.server.resourcemanager.TestClientRMService
        hadoop.yarn.server.resourcemanager.monitor.invariants.TestMetricsInvariantChecker
        hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter
        hadoop.mapreduce.lib.input.TestLineRecordReader
        hadoop.mapred.TestLineRecordReader
        hadoop.yarn.sls.TestSLSRunner
        hadoop.resourceestimator.service.TestResourceEstimatorService
        hadoop.resourceestimator.solver.impl.TestLpSolver

    cc:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/diff-compile-cc-root.txt [4.0K]

    javac:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/diff-compile-javac-root.txt [476K]

    checkstyle:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/diff-checkstyle-root.txt [14M]

    hadolint:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/diff-patch-hadolint.txt [4.0K]

    mvnsite:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-mvnsite-root.txt [560K]

    pathlen:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/pathlen.txt [12K]

    pylint:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/diff-patch-pylint.txt [20K]

    shellcheck:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/diff-patch-shellcheck.txt [72K]

    whitespace:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/whitespace-eol.txt [12M]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/whitespace-tabs.txt [1.3M]

    javadoc:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-javadoc-root.txt [40K]

    unit:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt [224K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt [424K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs_src_contrib_bkjournal.txt [12K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-rbf.txt [36K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt [20K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt [128K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-unit-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-core.txt [104K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-unit-hadoop-tools_hadoop-azure.txt [20K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-unit-hadoop-tools_hadoop-sls.txt [28K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/602/artifact/out/patch-unit-hadoop-tools_hadoop-resourceestimator.txt [16K]

    asflicense:
        https://ci-hadoop.apache.org/job/hadoop-qbt
Re: [VOTE] Release Apache Hadoop 3.2.3 - RC0
+1 (non-binding)

Using hadoop-vote.sh:
* Signature: ok
* Checksum: ok
* Rat check (1.8.0_301): ok - mvn clean apache-rat:check
* Built from source (1.8.0_301): ok - mvn clean install -DskipTests
* Built tar from source (1.8.0_301): ok - mvn clean package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true

Tested HDFS in pseudo-distributed mode with an HBase 2.4 pseudo-distributed cluster (1M rows ingested); all good.

Test PR to run the full build and track UT failures: https://github.com/apache/hadoop/pull/4073. A few tests are flaky, but they pass locally.

On Mon, Mar 14, 2022 at 12:45 PM Masatake Iwasaki < iwasak...@oss.nttdata.co.jp> wrote:
> Hi all,
>
> Here's Hadoop 3.2.3 release candidate #0:
>
> The RC is available at:
> https://home.apache.org/~iwasakims/hadoop-3.2.3-RC0/
>
> The RC tag is at:
> https://github.com/apache/hadoop/releases/tag/release-3.2.3-RC0
>
> The Maven artifacts are staged at:
> https://repository.apache.org/content/repositories/orgapachehadoop-1339
>
> You can find my public key at:
> https://downloads.apache.org/hadoop/common/KEYS
>
> Please evaluate the RC and vote.
>
> Thanks,
> Masatake Iwasaki
>
> -
> To unsubscribe, e-mail: hdfs-dev-unsubscr...@hadoop.apache.org
> For additional commands, e-mail: hdfs-dev-h...@hadoop.apache.org
>
>
[jira] [Created] (HADOOP-18159) Certificate doesn't match any of the subject alternative names: [*.s3.amazonaws.com, s3.amazonaws.com]
André F. created HADOOP-18159:
-

Summary: Certificate doesn't match any of the subject alternative names: [*.s3.amazonaws.com, s3.amazonaws.com]
Key: HADOOP-18159
URL: https://issues.apache.org/jira/browse/HADOOP-18159
Project: Hadoop Common
Issue Type: Bug
Components: fs/s3
Affects Versions: 3.3.1
Environment: hadoop 3.3.1, JDK8
Reporter: André F.

Trying to run any job after bumping our Spark version (which now uses Hadoop 3.3.1) led us to the following exception while reading files on S3:

{code:java}
org.apache.hadoop.fs.s3a.AWSClientIOException: getFileStatus on s3a:///.parquet: com.amazonaws.SdkClientException: Unable to execute HTTP request: Certificate for doesn't match any of the subject alternative names: [*.s3.amazonaws.com, s3.amazonaws.com]: Unable to execute HTTP request: Certificate for doesn't match any of the subject alternative names: [*.s3.amazonaws.com, s3.amazonaws.com]
    at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:208)
    at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:170)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3351)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:3185)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.isDirectory(S3AFileSystem.java:4277)
    at
{code}

{code:java}
Caused by: javax.net.ssl.SSLPeerUnverifiedException: Certificate for doesn't match any of the subject alternative names: [*.s3.amazonaws.com, s3.amazonaws.com]
    at com.amazonaws.thirdparty.apache.http.conn.ssl.SSLConnectionSocketFactory.verifyHostname(SSLConnectionSocketFactory.java:507)
    at com.amazonaws.thirdparty.apache.http.conn.ssl.SSLConnectionSocketFactory.createLayeredSocket(SSLConnectionSocketFactory.java:437)
    at com.amazonaws.thirdparty.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:384)
    at com.amazonaws.thirdparty.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
    at com.amazonaws.thirdparty.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:376)
    at sun.reflect.GeneratedMethodAccessor36.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.amazonaws.http.conn.ClientConnectionManagerFactory$Handler.invoke(ClientConnectionManagerFactory.java:76)
    at com.amazonaws.http.conn.$Proxy16.connect(Unknown Source)
    at com.amazonaws.thirdparty.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:393)
    at com.amazonaws.thirdparty.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
    at com.amazonaws.thirdparty.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
    at com.amazonaws.thirdparty.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
    at com.amazonaws.thirdparty.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
    at com.amazonaws.thirdparty.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
    at com.amazonaws.http.apache.client.impl.SdkHttpClient.execute(SdkHttpClient.java:72)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1333)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1145)
{code}

We found similar problems in the following tickets, but:
- https://issues.apache.org/jira/browse/HADOOP-17017 (we don't use `.` in our bucket names)
- [https://github.com/aws/aws-sdk-java-v2/issues/1786] (we tried to override it by using `httpclient:4.5.10` or `httpclient:4.5.8`, with no effect).

We couldn't test it using the native `openssl` configuration due to our setup, so we would like to stick with the Java SSL implementation, if possible.

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org
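For reference, a minimal sketch of the failing call path from the stack trace above, using only the public FileSystem API; the bucket and object names here are hypothetical placeholders, and credentials are assumed to come from the usual S3A provider chain:

{code:java}
// Minimal reproduction sketch (hypothetical bucket/key names).
// The call path matches the trace above:
// S3AFileSystem.getFileStatus -> s3GetFileStatus -> AWS SDK HTTP request,
// where the SSLPeerUnverifiedException surfaces in the affected setup.
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class S3AGetFileStatusRepro {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create("s3a://example-bucket/"), conf);
    // In the affected environment this call fails with
    // AWSClientIOException caused by SSLPeerUnverifiedException.
    System.out.println(fs.getFileStatus(new Path("s3a://example-bucket/data/part-00000.parquet")));
  }
}
{code}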
[jira] [Created] (HADOOP-18160) `wildfly.openssl` should not be shaded by Hadoop build
André F. created HADOOP-18160:
-

Summary: `wildfly.openssl` should not be shaded by Hadoop build
Key: HADOOP-18160
URL: https://issues.apache.org/jira/browse/HADOOP-18160
Project: Hadoop Common
Issue Type: Bug
Components: build
Affects Versions: 3.3.1
Environment: hadoop 3.3.1, spark 3.2.1, JDK8
Reporter: André F.

`org.wildfly.openssl` is a runtime library, and its references are shaded in the Hadoop build. This breaks integration with other frameworks such as Spark whenever "fs.s3a.ssl.channel.mode" is set to "openssl". The error produced in this situation, raised when the provider is instantiated from the `DelegatingSSLSocketFactory`, is:

{code:java}
Suppressed: java.lang.NoClassDefFoundError: org/apache/hadoop/shaded/org/wildfly/openssl/OpenSSLProvider
{code}

Spark adds the library to its classpath without the shaded prefix, hence the mismatch. Dependencies that are not in "compile" scope should probably not be shaded, to avoid this kind of integration issue.

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org
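Until the shading itself is fixed, one possible workaround (a hedged sketch, not a definitive fix) is to keep the channel mode on the stock JSSE implementation so that `DelegatingSSLSocketFactory` never tries to load the OpenSSL provider; the `default_jsse` value is assumed from the fs.s3a.ssl.channel.mode documentation and should be verified against the Hadoop version in use:

{code:java}
// Hedged workaround sketch: force the JSSE implementation so the (shaded)
// org.wildfly.openssl.OpenSSLProvider class is never loaded.
// The "default_jsse" value is an assumption taken from the
// fs.s3a.ssl.channel.mode documentation; verify it for your release.
import org.apache.hadoop.conf.Configuration;

public class S3aJsseOnly {
  public static Configuration withJavaSsl() {
    Configuration conf = new Configuration();
    conf.set("fs.s3a.ssl.channel.mode", "default_jsse");
    return conf;
  }
}
{code}

In Spark, the equivalent should be passing the same key through the usual prefix, e.g. `--conf spark.hadoop.fs.s3a.ssl.channel.mode=default_jsse`.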
Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86_64
For more details, see https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/810/

[Mar 16, 2022 1:32:29 AM] (noreply) HDFS-16502. Reconfigure Block Invalidate limit (#4064)

[Error replacing 'FILE' - Workspace is not accessible]

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org
[jira] [Created] (HADOOP-18161) [WASB] Retry not getting implemented when using wasb scheme in hadoop-azure 2.7.4
Aryan created HADOOP-18161: -- Summary: [WASB] Retry not getting implemented when using wasb scheme in hadoop-azure 2.7.4 Key: HADOOP-18161 URL: https://issues.apache.org/jira/browse/HADOOP-18161 Project: Hadoop Common Issue Type: Bug Components: fs/azure, hadoop-thirdparty Affects Versions: 2.7.4 Reporter: Aryan I am using prestodb to read data from blob. Presto is using hadoop-azure-2.7.4 jar. I'm using *wasb* scheme to query the data on blob. I'm afraid for some reason the hadoop-azure library is not retrying when getting IO exception. Attaching the stack trace below, {code:java} com.facebook.presto.spi.PrestoException: Error reading from wasb://oemdpv3prd...@oemdpv3prd.blob.core.windows.net/data/pipelines/hudi/kafka/telemetrics_v2/dp.hmi.quectel.bms.data.packet.v2/dt=2022-01-15/e576abc3-942a-434d-be02-6899798258eb-0_5-13327-290407_20220115211203.parquet at position 65924529 at com.facebook.presto.hive.parquet.HdfsParquetDataSource.readInternal(HdfsParquetDataSource.java:66) at com.facebook.presto.parquet.AbstractParquetDataSource.readFully(AbstractParquetDataSource.java:60) at com.facebook.presto.parquet.AbstractParquetDataSource.readFully(AbstractParquetDataSource.java:51) at com.facebook.presto.parquet.reader.ParquetReader.readPrimitive(ParquetReader.java:247) at com.facebook.presto.parquet.reader.ParquetReader.readColumnChunk(ParquetReader.java:330) at com.facebook.presto.parquet.reader.ParquetReader.readBlock(ParquetReader.java:313) at com.facebook.presto.hive.parquet.ParquetPageSource$ParquetBlockLoader.load(ParquetPageSource.java:182) at com.facebook.presto.hive.parquet.ParquetPageSource$ParquetBlockLoader.load(ParquetPageSource.java:160) at com.facebook.presto.common.block.LazyBlock.assureLoaded(LazyBlock.java:291) at com.facebook.presto.common.block.LazyBlock.getLoadedBlock(LazyBlock.java:282) at com.facebook.presto.operator.ScanFilterAndProjectOperator$RecordingLazyBlockLoader.load(ScanFilterAndProjectOperator.java:314) at com.facebook.presto.operator.ScanFilterAndProjectOperator$RecordingLazyBlockLoader.load(ScanFilterAndProjectOperator.java:300) at com.facebook.presto.common.block.LazyBlock.assureLoaded(LazyBlock.java:291) at com.facebook.presto.common.block.LazyBlock.getLoadedBlock(LazyBlock.java:282) at com.facebook.presto.operator.project.InputPageProjection.project(InputPageProjection.java:69) at com.facebook.presto.operator.project.PageProjectionWithOutputs.project(PageProjectionWithOutputs.java:56) at com.facebook.presto.operator.project.PageProcessor$ProjectSelectedPositions.processBatch(PageProcessor.java:323) at com.facebook.presto.operator.project.PageProcessor$ProjectSelectedPositions.process(PageProcessor.java:197) at com.facebook.presto.operator.WorkProcessorUtils$ProcessWorkProcessor.process(WorkProcessorUtils.java:315) at com.facebook.presto.operator.WorkProcessorUtils$YieldingIterator.computeNext(WorkProcessorUtils.java:79) at com.facebook.presto.operator.WorkProcessorUtils$YieldingIterator.computeNext(WorkProcessorUtils.java:65) at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:141) at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:136) at com.facebook.presto.operator.project.MergingPageOutput.getOutput(MergingPageOutput.java:113) at com.facebook.presto.operator.ScanFilterAndProjectOperator.processPageSource(ScanFilterAndProjectOperator.java:295) at com.facebook.presto.operator.ScanFilterAndProjectOperator.getOutput(ScanFilterAndProjectOperator.java:242) at 
com.facebook.presto.operator.Driver.processInternal(Driver.java:418) at com.facebook.presto.operator.Driver.lambda$processFor$9(Driver.java:301) at com.facebook.presto.operator.Driver.tryWithLock(Driver.java:722) at com.facebook.presto.operator.Driver.processFor(Driver.java:294) at com.facebook.presto.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:1077) at com.facebook.presto.execution.executor.PrioritizedSplitRunner.process(PrioritizedSplitRunner.java:162) at com.facebook.presto.execution.executor.TaskExecutor$TaskRunner.run(TaskExecutor.java:599) at com.facebook.presto.$gen.Presto_0_256_2ed7f7320220112_133056_1.run(Unknown Source) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: java.io.IOException: undefined at com.facebook.presto.hadoop.$internal.com.microsoft.azure.storage.core.Utility.initIOException(Utility.java:598) at com.facebook.presto.hadoop.$internal.com.microsoft.azure.storage.blob.BlobInputStream.dispatchRead(BlobInputStream.
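For completeness, this is how one would expect to tighten the WASB retry behaviour from configuration; the key names below are assumptions taken from the hadoop-azure (WASB) documentation of later releases, and whether hadoop-azure 2.7.4 honours them on this read path is exactly what this ticket questions:

{code:java}
// Hedged sketch: retry-related WASB settings that would be expected to apply.
// Key names are assumed from later hadoop-azure documentation; whether the
// 2.7.4 read path (BlobInputStream) retries at all is the open question here.
import org.apache.hadoop.conf.Configuration;

public class WasbRetrySettings {
  public static Configuration withRetries() {
    Configuration conf = new Configuration();
    conf.setInt("fs.azure.io.retry.max.retries", 30);             // assumed key
    conf.setInt("fs.azure.io.retry.min.backoff.interval", 1000);  // ms, assumed key
    conf.setInt("fs.azure.io.retry.max.backoff.interval", 8000);  // ms, assumed key
    return conf;
  }
}
{code}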
[jira] [Created] (HADOOP-18162) Hadoop common enhancements for the Manifest Committer of MAPREDUCE-7341
Steve Loughran created HADOOP-18162:
---

Summary: Hadoop common enhancements for the Manifest Committer of MAPREDUCE-7341
Key: HADOOP-18162
URL: https://issues.apache.org/jira/browse/HADOOP-18162
Project: Hadoop Common
Issue Type: Improvement
Components: fs
Affects Versions: 3.3.3
Reporter: Steve Loughran

Make the necessary changes to hadoop-common to support the manifest committer of MAPREDUCE-7341:
* new stats names in StoreStatisticNames (for joint use with the s3a committers)
* improvements to IOStatistics
* the s3a committer task pool to move over as ThreadPool

This does not break the s3a committers, as they will only adopt these changes in HADOOP-17833.

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org
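As a rough illustration of the IOStatistics side of this work, here is a sketch only; it assumes the org.apache.hadoop.fs.statistics package shipped in Hadoop 3.3, and the exact method names should be checked against that API:

{code:java}
// Hedged sketch: print the IOStatistics of a FileSystem after some I/O.
// Assumes the org.apache.hadoop.fs.statistics package from Hadoop 3.3+;
// the IOStatisticsLogging method name is an assumption to verify.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.statistics.IOStatisticsLogging;

public class PrintIOStatistics {
  public static void main(String[] args) throws Exception {
    FileSystem fs = FileSystem.get(new Configuration());
    fs.listStatus(new Path("/"));  // any operation that updates the statistics
    // Expected to return an empty string if the filesystem exposes no IOStatistics.
    System.out.println(IOStatisticsLogging.ioStatisticsSourceToString(fs));
  }
}
{code}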
[jira] [Created] (HADOOP-18163) hadoop-azure support for the Manifest Committer of MAPREDUCE-7341
Steve Loughran created HADOOP-18163:
---

Summary: hadoop-azure support for the Manifest Committer of MAPREDUCE-7341
Key: HADOOP-18163
URL: https://issues.apache.org/jira/browse/HADOOP-18163
Project: Hadoop Common
Issue Type: New Feature
Components: fs/azure
Affects Versions: 3.3.3
Reporter: Steve Loughran
Assignee: Steve Loughran

Follow-on patch to MAPREDUCE-7341: abfs support and tests
* resilient rename
* tests for job commit through the manifest committer.

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org
[jira] [Created] (HADOOP-18164) Backport HADOOP-12760 to 2.10
Tom McCormick created HADOOP-18164:
--

Summary: Backport HADOOP-12760 to 2.10
Key: HADOOP-18164
URL: https://issues.apache.org/jira/browse/HADOOP-18164
Project: Hadoop Common
Issue Type: Improvement
Components: common
Affects Versions: 2.10.0
Reporter: Tom McCormick
Fix For: 2.10.0

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org
Apache Hadoop qbt Report: branch-3.2+JDK8 on Linux/x86_64
For more details, see https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/

[Mar 10, 2022 7:35:40 AM] (Akira Ajisaka) YARN-10538: Add RECOMMISSIONING nodes to the list of updated nodes returned to the AM (#2564)
[Mar 10, 2022 7:41:14 AM] (Masatake Iwasaki) YARN-9783. Remove low-level zookeeper test to be able to build Hadoop against zookeeper 3.5.5. Contributed by Mate Szalay-Beko.
[Mar 10, 2022 9:56:04 AM] (Masatake Iwasaki) HADOOP-17718. Explicitly set locale in the Dockerfile. (#3034)
[Mar 12, 2022 12:10:44 PM] (Xiaoqiao He) HDFS-16428. Source path with storagePolicy cause wrong typeConsumed while rename (#3898). Contributed by lei w.
[Mar 14, 2022 1:04:47 AM] (Masatake Iwasaki) HADOOP-16811: Use JUnit TemporaryFolder Rule in TestFileUtils (#1811). Contributed by David Mollitor.
[Mar 14, 2022 1:05:01 AM] (Masatake Iwasaki) HADOOP-18155. Refactor tests in TestFileUtil (#4063)

-1 overall

The following subsystems voted -1:
    asflicense blanks hadolint pathlen unit xml

The following subsystems voted -1 but were configured to be filtered/ignored:
    cc checkstyle javac javadoc pylint shellcheck

The following subsystems are considered long running: (runtime bigger than 1h 0m 0s)
    unit

Specific tests:

    XML : Parsing Error(s):
        hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-excerpt.xml
        hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags.xml
        hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags2.xml
        hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-sample-output.xml

    Failed junit tests :
        hadoop.hdfs.TestReconstructStripedFileWithValidator
        hadoop.hdfs.server.datanode.TestBPOfferService
        hadoop.hdfs.tools.offlineImageViewer.TestOfflineImageViewer
        hadoop.hdfs.server.blockmanagement.TestUnderReplicatedBlocks
        hadoop.yarn.server.timelineservice.security.TestTimelineAuthFilterForV2
        hadoop.mapred.uploader.TestFrameworkUploader
        hadoop.yarn.sls.TestSLSStreamAMSynth

    cc:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/results-compile-cc-root.txt [48K]

    javac:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/results-compile-javac-root.txt [336K]

    blanks:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/blanks-eol.txt [13M]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/blanks-tabs.txt [2.0M]

    checkstyle:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/results-checkstyle-root.txt [14M]

    hadolint:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/results-hadolint.txt [8.0K]

    pathlen:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/results-pathlen.txt [16K]

    pylint:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/results-pylint.txt [148K]

    shellcheck:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/results-shellcheck.txt [20K]

    xml:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/xml.txt [16K]

    javadoc:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/results-javadoc-javadoc-root.txt [1.7M]

    unit:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt [532K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-tests.txt [12K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/patch-unit-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-uploader.txt [12K]
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/patch-unit-hadoop-tools_hadoop-sls.txt [12K]

    asflicense:
        https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/38/artifact/out/results-asflicense.txt [4.0K]

Powered by Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org
Re: [VOTE] Release Apache Hadoop 3.2.3 - RC0
Hi,

There is no aarch64 artifact in the release candidate. Is this something that is intended?

Best,
Emil Ejbyfeldt

On 14/03/2022 08:14, Masatake Iwasaki wrote:
> Hi all,
>
> Here's Hadoop 3.2.3 release candidate #0:
>
> The RC is available at:
> https://home.apache.org/~iwasakims/hadoop-3.2.3-RC0/
>
> The RC tag is at:
> https://github.com/apache/hadoop/releases/tag/release-3.2.3-RC0
>
> The Maven artifacts are staged at:
> https://repository.apache.org/content/repositories/orgapachehadoop-1339
>
> You can find my public key at:
> https://downloads.apache.org/hadoop/common/KEYS
>
> Please evaluate the RC and vote.
>
> Thanks,
> Masatake Iwasaki
>
> -
> To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
> For additional commands, e-mail: common-dev-h...@hadoop.apache.org

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org
Re: [VOTE] Release Apache Hadoop 3.2.3 - RC0
aarch64 support is only introduced in/after 3.3.0 On Thu, Mar 17, 2022 at 2:27 PM Emil Ejbyfeldt wrote: > Hi, > > > There is no aarch64 artifact in the release candidate. Is this something > that is intended? > > Best, > Emil Ejbyfeldt > > On 14/03/2022 08:14, Masatake Iwasaki wrote: > > Hi all, > > > > Here's Hadoop 3.2.3 release candidate #0: > > > > The RC is available at: > >https://home.apache.org/~iwasakims/hadoop-3.2.3-RC0/ > > > > The RC tag is at: > >https://github.com/apache/hadoop/releases/tag/release-3.2.3-RC0 > > > > The Maven artifacts are staged at: > > > https://repository.apache.org/content/repositories/orgapachehadoop-1339 > > > > You can find my public key at: > >https://downloads.apache.org/hadoop/common/KEYS > > > > Please evaluate the RC and vote. > > > > Thanks, > > Masatake Iwasaki > > > > - > > To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org > > For additional commands, e-mail: common-dev-h...@hadoop.apache.org > > > > - > To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org > For additional commands, e-mail: common-dev-h...@hadoop.apache.org > >