[ https://issues.apache.org/jira/browse/HIVE-21292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16772507#comment-16772507 ]
Hive QA commented on HIVE-21292:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12959307/HIVE-21292.01.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/16152/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/16152/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-16152/

Messages:
{noformat}
**** This message was trimmed, see log for full details ****
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2019-02-20 01:25:20.065
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-16152/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2019-02-20 01:25:20.068
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at 104fa19 Hive replication to a target with hive.strict.managed.tables enabled is failing when used HMS on postgres. (Mahesh Kumar Behera, reviewed by Sankar Hariappan)
+ git clean -f -d
Removing ${project.basedir}/
Removing itests/${project.basedir}/
Removing standalone-metastore/metastore-server/src/gen/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at 104fa19 Hive replication to a target with hive.strict.managed.tables enabled is failing when used HMS on postgres. (Mahesh Kumar Behera, reviewed by Sankar Hariappan)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2019-02-20 01:25:20.763
+ rm -rf ../yetus_PreCommit-HIVE-Build-16152
+ mkdir ../yetus_PreCommit-HIVE-Build-16152
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-16152
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-16152/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ExportTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ReplCopyTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/Task.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/TaskFactory.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/ExternalTableCopyTaskBuilder.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/ReplDumpTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/ReplLoadTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/LoadDatabase.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/incremental/IncrementalLoadTasksBuilder.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/lockmgr/DbTxnManager.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/lockmgr/HiveTxnManager.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/lockmgr/HiveTxnManagerImpl.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/AlterDatabaseHandler.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/CreateDatabaseHandler.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/DropDatabaseHandler.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/AlterDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/CreateDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/DDLWork.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/DescDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/DropDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/LockDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/ShowDatabasesDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/SwitchDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/UnlockDatabaseDesc.java: does not exist in index
error: a/ql/src/test/results/clientnegative/database_create_already_exists.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/database_create_invalid_name.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/database_drop_not_empty.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/database_drop_not_empty_restrict.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/dbtxnmgr_nodblock.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/lockneg_query_tbl_in_locked_db.q.out: does not exist in index
error: a/ql/src/test/results/clientpositive/encrypted/encryption_move_tbl.q.out: does not exist in index
Going to apply patch with: git apply -p1
/data/hiveptest/working/scratch/build.patch:634: trailing whitespace.
/data/hiveptest/working/scratch/build.patch:4128: trailing whitespace.
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.ddl.DDLTask2. Database lockneg1 is not locked
warning: 2 lines add whitespace errors.
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc453133535937321504.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc453133535937321504.exe, -I/data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/src/main/protobuf/org/apache/hadoop/hive/metastore, --java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/target/generated-sources, /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator Version 3.5.2
protoc-jar: executing: [/tmp/protoc5364676237139939695.exe, --version]
libprotoc 2.5.0
ANTLR Parser Generator Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-server/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 41 classes.
ANTLR Parser Generator Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2439:5: Decision can match input such as "KW_CHECK KW_DATETIME" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2439:5: Decision can match input such as "KW_CHECK KW_DATE {LPAREN, StringLiteral}" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2439:5: Decision can match input such as "KW_CHECK KW_UNIONTYPE LESSTHAN" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2439:5: Decision can match input such as "KW_CHECK {KW_EXISTS, KW_TINYINT}" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2439:5: Decision can match input such as "KW_CHECK KW_STRUCT LESSTHAN" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:424:5: Decision can match input such as "KW_UNKNOWN" using multiple alternatives: 1, 10
As a result, alternative(s) 10 were disabled for that input
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HintParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HintParser.g
org/apache/hadoop/hive/ql/parse/HintParser.g
Generating vector expression code
Generating vector expression test code
Processing annotations
Annotations processed
Processing annotations
No elements to process
Processing annotations
Annotations processed
Processing annotations
No elements to process
Processing annotations
Annotations processed
Processing annotations
No elements to process
Feb 20, 2019 1:29:11 AM org.apache.jasper.servlet.TldScanner scanJars
INFO: At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
[ERROR] COMPILATION ERROR :
[ERROR] /data/hiveptest/working/apache-github-source-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/CreateDatabaseHook.java:[33,38] cannot find symbol
  symbol: class CreateDatabaseDesc
  location: package org.apache.hadoop.hive.ql.plan
[ERROR] /data/hiveptest/working/apache-github-source-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java:[35,38] cannot find symbol
  symbol: class DescDatabaseDesc
  location: package org.apache.hadoop.hive.ql.plan
[ERROR] /data/hiveptest/working/apache-github-source-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java:[37,38] cannot find symbol
  symbol: class DropDatabaseDesc
  location: package org.apache.hadoop.hive.ql.plan
[ERROR] /data/hiveptest/working/apache-github-source-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java:[40,38] cannot find symbol
  symbol: class ShowDatabasesDesc
  location: package org.apache.hadoop.hive.ql.plan
[ERROR] /data/hiveptest/working/apache-github-source-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java:[44,38] cannot find symbol
  symbol: class SwitchDatabaseDesc
  location: package org.apache.hadoop.hive.ql.plan
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-hcatalog-core: Compilation failure: Compilation failure:
[ERROR] /data/hiveptest/working/apache-github-source-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/CreateDatabaseHook.java:[33,38] cannot find symbol
[ERROR] symbol: class CreateDatabaseDesc
[ERROR] location: package org.apache.hadoop.hive.ql.plan
[ERROR] /data/hiveptest/working/apache-github-source-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java:[35,38] cannot find symbol
[ERROR] symbol: class DescDatabaseDesc
[ERROR] location: package org.apache.hadoop.hive.ql.plan
[ERROR] /data/hiveptest/working/apache-github-source-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java:[37,38] cannot find symbol
[ERROR] symbol: class DropDatabaseDesc
[ERROR] location: package org.apache.hadoop.hive.ql.plan
[ERROR] /data/hiveptest/working/apache-github-source-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java:[40,38] cannot find symbol
[ERROR] symbol: class ShowDatabasesDesc
[ERROR] location: package org.apache.hadoop.hive.ql.plan
[ERROR] /data/hiveptest/working/apache-github-source-source/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java:[44,38] cannot find symbol
[ERROR] symbol: class SwitchDatabaseDesc
[ERROR] location: package org.apache.hadoop.hive.ql.plan
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hive-hcatalog-core
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-16152
+ exit 1
'
{noformat}

This message is automatically generated.
ATTACHMENT ID: 12959307 - PreCommit-HIVE-Build

> Break up DDLTask 1 - extract Database related operations
> --------------------------------------------------------
>
>                 Key: HIVE-21292
>                 URL: https://issues.apache.org/jira/browse/HIVE-21292
>             Project: Hive
>          Issue Type: Improvement
>          Components: Hive
>    Affects Versions: 3.1.1
>            Reporter: Miklos Gergely
>            Assignee: Miklos Gergely
>            Priority: Major
>             Fix For: 4.0.0
>
>         Attachments: HIVE-21292.01.patch
>
>
> DDLTask is a huge class, more than 5000 lines long. The related DDLWork is also a huge class, which has a field for each DDL operation it supports. The goal is to refactor these so that everything is cut into more manageable classes under the package org.apache.hadoop.hive.ql.exec.ddl:
> * have a separate class for each operation
> * have a package for each operation group (database ddl, table ddl, etc.), so the number of classes under a package stays manageable
> * make all the requests (DDLDesc subclasses) immutable
> * DDLTask should be agnostic to the actual operations
> * for now, ignore the issue that some operations handled by DDLTask are not actual DDL operations (lock, unlock, desc...)
> In the interim, while there are two DDLTask and DDLWork classes in the code base, the new ones in the new package are called DDLTask2 and DDLWork2, avoiding the need for fully qualified class names where both the old and the new classes are in use.
> Step #1: extract all the database related operations from the old DDLTask and move them under the new package. Also create the new internal framework.

-- 
This message was sent by Atlassian JIRA
(v7.6.3#76005)
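The design described in the issue (one class per operation, immutable DDLDesc requests, an operation-agnostic DDLTask2) can be sketched roughly as below. This is a minimal illustration, not Hive's actual code: apart from the names DDLTask2 and DDLDesc taken from the issue, all class and method names here are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Marker for immutable request objects (the DDLDesc subclasses from the issue).
interface DDLDesc { }

// An immutable request: all state is set in the constructor, no setters.
final class CreateDatabaseDesc implements DDLDesc {
    private final String name;
    CreateDatabaseDesc(String name) { this.name = name; }
    String getName() { return name; }
}

// One operation class per DDL operation; 0 means success, as with Hive tasks.
abstract class DDLOperation<T extends DDLDesc> {
    protected final T desc;
    DDLOperation(T desc) { this.desc = desc; }
    abstract int execute();
}

final class CreateDatabaseOperation extends DDLOperation<CreateDatabaseDesc> {
    CreateDatabaseOperation(CreateDatabaseDesc desc) { super(desc); }
    @Override int execute() {
        System.out.println("CREATE DATABASE " + desc.getName());
        return 0;
    }
}

// DDLTask2 stays agnostic: it maps the desc's type to its operation and runs
// it, knowing nothing about what any individual operation does.
public final class DDLTask2 {
    private static final Map<Class<? extends DDLDesc>,
            Function<DDLDesc, DDLOperation<?>>> REGISTRY = new HashMap<>();
    static {
        REGISTRY.put(CreateDatabaseDesc.class,
                d -> new CreateDatabaseOperation((CreateDatabaseDesc) d));
    }

    public static int run(DDLDesc desc) {
        return REGISTRY.get(desc.getClass()).apply(desc).execute();
    }

    public static void main(String[] args) {
        System.out.println(run(new CreateDatabaseDesc("testdb")) == 0 ? "ok" : "fail");
    }
}
```

Adding a new operation group (table DDL, etc.) then only means a new package of desc/operation pairs and registry entries; the dispatcher itself never changes. The failed precommit run above also hints at the migration cost of this pattern: hcatalog still imports the old org.apache.hadoop.hive.ql.plan desc classes that the patch moved.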