[ https://issues.apache.org/jira/browse/HIVE-19964?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16522019#comment-16522019 ]
Hive QA commented on HIVE-19964:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12928929/HIVE-19964.2.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/12076/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/12076/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-12076/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-06-25 08:39:18.579
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-12076/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-06-25 08:39:18.583
+ cd apache-github-source-source
+ git fetch origin
>From https://github.com/apache/hive
   37bcf7b..2277661  master     -> origin/master
+ git reset --hard HEAD
HEAD is now at 37bcf7b HIVE-19941 : Row based Filters added via Hive Ranger policies are not pushed to druid (Nishant Bangarwa via Jesus Camacho Rodriguez)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is behind 'origin/master' by 2 commits, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/master
HEAD is now at 2277661 HIVE-19922: TestMiniDruidKafkaCliDriver[druidkafkamini_basic] is flaky (Peter Vary, reviewed by Jason Dere)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-06-25 08:39:20.256
+ rm -rf ../yetus_PreCommit-HIVE-Build-12076
+ mkdir ../yetus_PreCommit-HIVE-Build-12076
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-12076
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-12076/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java: does not exist in index
error: a/ql/src/test/results/clientpositive/llap/resourceplan.q.out: does not exist in index
Going to apply patch with: git apply -p1
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc5755175417844837092.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc5755175417844837092.exe, -I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore, --java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources, /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
[ERROR] COMPILATION ERROR :
[ERROR] /data/hiveptest/working/apache-github-source-source/serde/src/test/org/apache/hadoop/hive/serde2/io/TestTimestampWritable.java:[46,8] class TestTimestampWritableV2 is public, should be declared in a file named TestTimestampWritableV2.java
[ERROR] /data/hiveptest/working/apache-github-source-source/serde/src/test/org/apache/hadoop/hive/serde2/io/TestDateWritable.java:[48,8] class TestDateWritableV2 is public, should be declared in a file named TestDateWritableV2.java
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:testCompile (default-testCompile) on project hive-serde: Compilation failure: Compilation failure:
[ERROR] /data/hiveptest/working/apache-github-source-source/serde/src/test/org/apache/hadoop/hive/serde2/io/TestTimestampWritable.java:[46,8] class TestTimestampWritableV2 is public, should be declared in a file named TestTimestampWritableV2.java
[ERROR] /data/hiveptest/working/apache-github-source-source/serde/src/test/org/apache/hadoop/hive/serde2/io/TestDateWritable.java:[48,8] class TestDateWritableV2 is public, should be declared in a file named TestDateWritableV2.java
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-serde
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-12076
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12928929 - PreCommit-HIVE-Build
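The -1 above is a compile failure rather than a test failure: javac requires a public top-level class to be declared in a source file whose name matches the class name, so a test class renamed to *V2 cannot stay in the old *.java file. A minimal sketch of that rule, with illustrative file and class names rather than the actual Hive sources:

{code:java}
// File: TestDateWritableV2.java -- the file name must match the public class name.
// Leaving this class in a file still named TestDateWritable.java makes javac fail
// with the same "should be declared in a file named ..." diagnostic seen in the log.
public class TestDateWritableV2 {
  public static void main(String[] args) {
    System.out.println("compiles because the file is named TestDateWritableV2.java");
  }
}
{code}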
> Apply resource plan fails if trigger expression has quotes
> ----------------------------------------------------------
>
>                 Key: HIVE-19964
>                 URL: https://issues.apache.org/jira/browse/HIVE-19964
>             Project: Hive
>          Issue Type: Bug
>    Affects Versions: 3.1.0, 4.0.0
>            Reporter: Aswathy Chellammal Sreekumar
>            Assignee: Prasanth Jayachandran
>            Priority: Major
>         Attachments: HIVE-19964.1.patch, HIVE-19964.2.patch
>
>
> {code:java}
> 0: jdbc:hive2://localhost:10000> CREATE TRIGGER global.big_hdfs_read WHEN HDFS_BYTES_READ > '300kb' DO KILL;
> INFO : Compiling command(queryId=pjayachandran_20180621131017_72b1441b-d790-4db7-83ca-479735843890): CREATE TRIGGER global.big_hdfs_read WHEN HDFS_BYTES_READ > '300kb' DO KILL
> INFO : Semantic Analysis Completed (retrial = false)
> INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
> INFO : Completed compiling command(queryId=pjayachandran_20180621131017_72b1441b-d790-4db7-83ca-479735843890); Time taken: 0.015 seconds
> INFO : Executing command(queryId=pjayachandran_20180621131017_72b1441b-d790-4db7-83ca-479735843890): CREATE TRIGGER global.big_hdfs_read WHEN HDFS_BYTES_READ > '300kb' DO KILL
> INFO : Starting task [Stage-0:DDL] in serial mode
> INFO : Completed executing command(queryId=pjayachandran_20180621131017_72b1441b-d790-4db7-83ca-479735843890); Time taken: 0.025 seconds
> INFO : OK
> No rows affected (0.054 seconds)
> 0: jdbc:hive2://localhost:10000> ALTER TRIGGER global.big_hdfs_read ADD TO UNMANAGED;
> INFO : Compiling command(queryId=pjayachandran_20180621131031_dd489324-db23-412f-9409-32ba697a10e5): ALTER TRIGGER global.big_hdfs_read ADD TO UNMANAGED
> INFO : Semantic Analysis Completed (retrial = false)
> INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
> INFO : Completed compiling command(queryId=pjayachandran_20180621131031_dd489324-db23-412f-9409-32ba697a10e5); Time taken: 0.014 seconds
> INFO : Executing command(queryId=pjayachandran_20180621131031_dd489324-db23-412f-9409-32ba697a10e5): ALTER TRIGGER global.big_hdfs_read ADD TO UNMANAGED
> INFO : Starting task [Stage-0:DDL] in serial mode
> INFO : Completed executing command(queryId=pjayachandran_20180621131031_dd489324-db23-412f-9409-32ba697a10e5); Time taken: 0.029 seconds
> INFO : OK
> No rows affected (0.054 seconds)
> 0: jdbc:hive2://localhost:10000> ALTER RESOURCE PLAN global ENABLE;
> INFO : Compiling command(queryId=pjayachandran_20180621131036_26a5f4f3-91e3-4bec-ab42-800adb90104e): ALTER RESOURCE PLAN global ENABLE
> INFO : Semantic Analysis Completed (retrial = false)
> INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
> INFO : Completed compiling command(queryId=pjayachandran_20180621131036_26a5f4f3-91e3-4bec-ab42-800adb90104e); Time taken: 0.012 seconds
> INFO : Executing command(queryId=pjayachandran_20180621131036_26a5f4f3-91e3-4bec-ab42-800adb90104e): ALTER RESOURCE PLAN global ENABLE
> INFO : Starting task [Stage-0:DDL] in serial mode
> INFO : Completed executing command(queryId=pjayachandran_20180621131036_26a5f4f3-91e3-4bec-ab42-800adb90104e); Time taken: 0.021 seconds
> INFO : OK
> No rows affected (0.045 seconds)
> 0: jdbc:hive2://localhost:10000> ALTER RESOURCE PLAN global ACTIVATE;
> INFO : Compiling command(queryId=pjayachandran_20180621131037_551b2af0-321b-4638-8ac0-76771a159f4b): ALTER RESOURCE PLAN global ACTIVATE
> INFO : Semantic Analysis Completed (retrial = false)
> INFO : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
> INFO : Completed compiling command(queryId=pjayachandran_20180621131037_551b2af0-321b-4638-8ac0-76771a159f4b); Time taken: 0.017 seconds
> INFO : Executing command(queryId=pjayachandran_20180621131037_551b2af0-321b-4638-8ac0-76771a159f4b): ALTER RESOURCE PLAN global ACTIVATE
> INFO : Starting task [Stage-0:DDL] in serial mode
> ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Invalid expression: HDFS_BYTES_READ > 300kb
> INFO : Completed executing command(queryId=pjayachandran_20180621131037_551b2af0-321b-4638-8ac0-76771a159f4b); Time taken: 0.037 seconds
> Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Invalid expression: HDFS_BYTES_READ > 300kb (state=08S01,code=1)
> {code}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
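What the quoted transcript shows: the trigger is created with a quoted size literal (HDFS_BYTES_READ > '300kb'), but the error raised when the resource plan is activated prints the expression with the quotes stripped (HDFS_BYTES_READ > 300kb), and that is the form the DDLTask rejects as invalid. The sketch below only illustrates the two spellings of such a counter-limit expression; it is not Hive's trigger-expression parser, and the class name, method name, and unit table are assumptions made for the example:

{code:java}
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative sketch: accepts "COUNTER > <limit>" where <limit> may be quoted
// ('300kb') or unquoted (300kb), i.e. both forms that appear in the transcript.
public class TriggerExpressionSketch {

  // Hypothetical unit table; Hive's real size-suffix handling is not reproduced here.
  private static final Map<String, Long> UNITS =
      Map.of("b", 1L, "kb", 1024L, "mb", 1024L * 1024, "gb", 1024L * 1024 * 1024);

  private static final Pattern EXPR =
      Pattern.compile("\\s*(\\w+)\\s*>\\s*'?(\\d+)\\s*([a-zA-Z]*)'?\\s*");

  static long parseLimitInBytes(String expression) {
    Matcher m = EXPR.matcher(expression);
    if (!m.matches()) {
      throw new IllegalArgumentException("Invalid expression: " + expression);
    }
    long value = Long.parseLong(m.group(2));
    String unit = m.group(3).toLowerCase();
    Long multiplier = unit.isEmpty() ? 1L : UNITS.get(unit);
    if (multiplier == null) {
      throw new IllegalArgumentException("Invalid expression: " + expression);
    }
    return value * multiplier;
  }

  public static void main(String[] args) {
    // Both spellings resolve to the same limit here; a validator that handles only
    // one of them reproduces the activation failure shown above.
    System.out.println(parseLimitInBytes("HDFS_BYTES_READ > '300kb'")); // 307200
    System.out.println(parseLimitInBytes("HDFS_BYTES_READ > 300kb"));   // 307200
  }
}
{code}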