[ https://issues.apache.org/jira/browse/HIVE-21401?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16788222#comment-16788222 ]
Hive QA commented on HIVE-21401:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12961736/HIVE-21401.04.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/16415/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/16415/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-16415/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2019-03-08 19:23:52.551
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-16415/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2019-03-08 19:23:52.554
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at 0dd45a2 HIVE-21280 : Null pointer exception on running compaction against a MM table. (Aditya Shah via Ashutosh Chauhan)
+ git clean -f -d
Removing standalone-metastore/metastore-server/src/gen/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at 0dd45a2 HIVE-21280 : Null pointer exception on running compaction against a MM table. (Aditya Shah via Ashutosh Chauhan)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2019-03-08 19:23:53.608
+ rm -rf ../yetus_PreCommit-HIVE-Build-16415
+ mkdir ../yetus_PreCommit-HIVE-Build-16415
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-16415
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-16415/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: patch failed: ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java:4803
Falling back to three-way merge...
Applied patch to 'ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java' with conflicts.
error: patch failed: ql/src/java/org/apache/hadoop/hive/ql/plan/ShowCreateDatabaseDesc.java:1
error: ql/src/java/org/apache/hadoop/hive/ql/plan/ShowCreateDatabaseDesc.java: patch does not apply
error: core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/CreateTableHook.java: does not exist in index
error: core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java: does not exist in index
error: util/src/main/java/org/apache/hadoop/hive/ql/metadata/DummySemanticAnalyzerHook.java: does not exist in index
error: util/src/main/java/org/apache/hadoop/hive/ql/metadata/DummySemanticAnalyzerHook1.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/ddl/DDLOperation.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/ddl/DDLOperationContext.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/ddl/DDLTask2.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/ddl/DDLWork2.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/table/LoadPartitions.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/table/LoadTable.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/io/AcidUtils.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/lockmgr/DbTxnManager.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/lockmgr/HiveTxnManager.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/lockmgr/HiveTxnManagerImpl.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/metadata/Hive.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/metadata/formatting/MetaDataFormatUtils.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/metadata/formatting/TextMetaDataFormatter.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/AcidExportSemanticAnalyzer.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/ImportSemanticAnalyzer.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/ParseContext.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/PreInsertTableDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/QB.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/TaskCompiler.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/DropPartitionHandler.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/DropTableHandler.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/TruncatePartitionHandler.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/TruncateTableHandler.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/CreateTableDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/CreateTableLikeDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/DDLWork.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/DescTableDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/DropTableDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/ImportTableDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/LoadFileDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/LockTableDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/PlanUtils.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/ShowCreateDatabaseDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/ShowCreateTableDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/ShowTableStatusDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/ShowTablesDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/ShowTblPropertiesDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/TruncateTableDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/plan/UnlockTableDesc.java: does not exist in index
error: src/java/org/apache/hadoop/hive/ql/txn/compactor/CompactorMR.java: does not exist in index
error: src/test/org/apache/hadoop/hive/ql/parse/TestHiveDecimalParse.java: does not exist in index
error: src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/CreateTableHook.java: does not exist in index
error: src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java: does not exist in index
error: src/main/java/org/apache/hadoop/hive/ql/metadata/DummySemanticAnalyzerHook.java: does not exist in index
error: src/main/java/org/apache/hadoop/hive/ql/metadata/DummySemanticAnalyzerHook1.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/ddl/DDLOperation.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/ddl/DDLOperationContext.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/ddl/DDLTask2.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/ddl/DDLWork2.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/exec/DDLTask.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/table/LoadPartitions.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/table/LoadTable.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/io/AcidUtils.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/lockmgr/DbTxnManager.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/lockmgr/HiveTxnManager.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/lockmgr/HiveTxnManagerImpl.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/metadata/Hive.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/metadata/formatting/MetaDataFormatUtils.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/metadata/formatting/TextMetaDataFormatter.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/AcidExportSemanticAnalyzer.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/ImportSemanticAnalyzer.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/ParseContext.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/PreInsertTableDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/QB.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/TaskCompiler.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/repl/load/message/DropPartitionHandler.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/repl/load/message/DropTableHandler.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/repl/load/message/TruncatePartitionHandler.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/parse/repl/load/message/TruncateTableHandler.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/CreateTableDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/CreateTableLikeDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/DDLWork.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/DescTableDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/DropTableDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/ImportTableDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/LoadFileDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/LockTableDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/PlanUtils.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/ShowCreateDatabaseDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/ShowCreateTableDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/ShowTableStatusDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/ShowTablesDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/ShowTblPropertiesDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/TruncateTableDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/plan/UnlockTableDesc.java: does not exist in index
error: java/org/apache/hadoop/hive/ql/txn/compactor/CompactorMR.java: does not exist in index
error: test/org/apache/hadoop/hive/ql/parse/TestHiveDecimalParse.java: does not exist in index
The patch does not appear to apply with p0, p1, or p2
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-16415
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12961736 - PreCommit-HIVE-Build

> Break up DDLTask - extract Table related operations
> ---------------------------------------------------
>
> Key: HIVE-21401
> URL: https://issues.apache.org/jira/browse/HIVE-21401
> Project: Hive
> Issue Type: Sub-task
> Components: Hive
> Affects Versions: 3.1.1
> Reporter: Miklos Gergely
> Assignee: Miklos Gergely
> Priority: Major
> Fix For: 4.0.0
>
> Attachments: HIVE-21401.01.patch, HIVE-21401.02.patch, HIVE-21401.03.patch, HIVE-21401.04.patch
>
>
> DDLTask is a huge class, more than 5000 lines long. The related DDLWork is also a huge class, which has a field for each DDL operation it supports.
> The goal is to refactor these so that everything is split into more manageable classes under the package org.apache.hadoop.hive.ql.exec.ddl:
> * have a separate class for each operation
> * have a package for each operation group (database DDL, table DDL, etc.), so the number of classes per package stays manageable
> * make all the requests (DDLDesc subclasses) immutable
> * DDLTask should be agnostic to the actual operations
> * for now, ignore the fact that some operations handled by DDLTask are not actual DDL operations (lock, unlock, desc...)
> In the interim, while there are two DDLTask and DDLWork classes in the code base, the new ones in the new package are called DDLTask2 and DDLWork2, so that fully qualified class names are not needed where both the old and the new classes are in use.
> Step #2: extract all the table related operations from the old DDLTask except alter table, and move them under the new package. Also create the new internal framework.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
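As an illustration of the framework described in the quoted issue, here is a minimal sketch of the intended shape: one immutable descriptor per operation and one operation class per descriptor, with the dispatching task depending only on a common interface. All class names, method signatures, and the driver below are hypothetical and are not taken from the HIVE-21401 patches.

{code:java}
// Hypothetical sketch only: names and signatures are illustrative, not the
// actual classes introduced by the HIVE-21401 patch series.

/** Marker interface for immutable DDL operation descriptors. */
interface DDLDesc {
}

/** An immutable descriptor: all state is supplied at construction time. */
final class DropTableDesc implements DDLDesc {
  private final String tableName;
  private final boolean ifExists;

  DropTableDesc(String tableName, boolean ifExists) {
    this.tableName = tableName;
    this.ifExists = ifExists;
  }

  String getTableName() {
    return tableName;
  }

  boolean isIfExists() {
    return ifExists;
  }
}

/** One class per DDL operation; the generic task only depends on this type. */
abstract class DDLOperation<T extends DDLDesc> {
  protected final T desc;

  DDLOperation(T desc) {
    this.desc = desc;
  }

  /** Returns 0 on success, mirroring the usual task return-code convention. */
  abstract int execute() throws Exception;
}

/** A table-related operation; such classes would be grouped in one package. */
final class DropTableOperation extends DDLOperation<DropTableDesc> {
  DropTableOperation(DropTableDesc desc) {
    super(desc);
  }

  @Override
  int execute() {
    // A real implementation would talk to the metastore; this only shows the dispatch shape.
    System.out.println("DROP TABLE " + (desc.isIfExists() ? "IF EXISTS " : "") + desc.getTableName());
    return 0;
  }
}

public class DDLSketch {
  public static void main(String[] args) throws Exception {
    // The dispatching task stays agnostic to what the concrete operation does.
    DDLOperation<?> operation = new DropTableOperation(new DropTableDesc("tmp_table", true));
    operation.execute();
  }
}
{code}

With this shape, adding a new DDL command means adding one small descriptor/operation pair instead of growing DDLTask and DDLWork further.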