[ https://issues.apache.org/jira/browse/HIVE-18193?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16472473#comment-16472473 ]

Hive QA commented on HIVE-18193:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12922796/HIVE-18193.01-branch-3.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/10831/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/10831/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-10831/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-11 18:45:13.081
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-10831/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z branch-3 ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-11 18:45:13.085
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   9cfc15a..154a686  master     -> origin/master
   b331338..32e29cc  branch-3   -> origin/branch-3
+ git reset --hard HEAD
HEAD is now at 9cfc15a HIVE-19435: Incremental replication cause data loss if a table is dropped followed by create and insert-into with different partition type (Sankar Hariappan, reviewed by Mahesh Kumar Behera, Thejas M Nair)
+ git clean -f -d
+ git checkout branch-3
Switched to branch 'branch-3'
Your branch is behind 'origin/branch-3' by 4 commits, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/branch-3
HEAD is now at 32e29cc HIVE-19453 : Extend Load Data statement to take Input file format and Serde as parameters (Deepak Jaiswal, reviewed by Jason Dere)
+ git merge --ff-only origin/branch-3
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-11 18:45:18.836
+ rm -rf ../yetus_PreCommit-HIVE-Build-10831
+ mkdir ../yetus_PreCommit-HIVE-Build-10831
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-10831
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-10831/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/txn/TxnDbUtil.java: does not exist in index
error: a/standalone-metastore/src/main/sql/derby/hive-schema-3.0.0.derby.sql: does not exist in index
error: a/standalone-metastore/src/main/sql/derby/upgrade-2.3.0-to-3.0.0.derby.sql: does not exist in index
error: a/standalone-metastore/src/main/sql/mssql/hive-schema-3.0.0.mssql.sql: does not exist in index
error: a/standalone-metastore/src/main/sql/mssql/upgrade-2.3.0-to-3.0.0.mssql.sql: does not exist in index
error: a/standalone-metastore/src/main/sql/mysql/hive-schema-3.0.0.mysql.sql: does not exist in index
error: a/standalone-metastore/src/main/sql/mysql/upgrade-2.3.0-to-3.0.0.mysql.sql: does not exist in index
error: a/standalone-metastore/src/main/sql/oracle/hive-schema-3.0.0.oracle.sql: does not exist in index
error: a/standalone-metastore/src/main/sql/oracle/upgrade-2.3.0-to-3.0.0.oracle.sql: does not exist in index
error: a/standalone-metastore/src/main/sql/postgres/hive-schema-3.0.0.postgres.sql: does not exist in index
error: a/standalone-metastore/src/main/sql/postgres/upgrade-2.3.0-to-3.0.0.postgres.sql: does not exist in index
Going to apply patch with: git apply -p1
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc5755933008813138933.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc5755933008813138933.exe, -I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore, --java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources, /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (process-resource-bundles) on project hive-shims-scheduler: Execution process-resource-bundles of goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process failed. ConcurrentModificationException -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-shims-scheduler
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12922796 - PreCommit-HIVE-Build

> Migrate existing ACID tables to use write id per table rather than global transaction id
> -----------------------------------------------------------------------------------------
>
>                 Key: HIVE-18193
>                 URL: https://issues.apache.org/jira/browse/HIVE-18193
>             Project: Hive
>          Issue Type: Sub-task
>          Components: HiveServer2, Transactions
>    Affects Versions: 3.0.0
>            Reporter: anishek
>            Assignee: Sankar Hariappan
>            Priority: Blocker
>              Labels: ACID, Upgrade
>             Fix For: 3.0.0, 3.1.0
>
>         Attachments: HIVE-18193.01-branch-3.patch, HIVE-18193.01.patch, HIVE-18193.02.patch
>
>
> Dependent upon HIVE-18192.
> For existing ACID tables we need to update the table-level write id metatables/sequences so that any new operations on these tables work seamlessly without conflicting data in the existing base/delta files.
> 1. Create the metadata tables NEXT_WRITE_ID and TXN_TO_WRITE_ID.
> 2. Add an entry for each ACID/MM table into NEXT_WRITE_ID, with NWI_NEXT set to the current value of NEXT_TXN_ID.NTXN_NEXT.
> 3. Ensure every currently open/aborted transaction has an entry in TXN_TO_WRITE_ID such that T2W_TXNID = T2W_WRITEID = the open/aborted txn id.
> 4. Add new columns TC_WRITEID in TXN_COMPONENTS and CTC_WRITEID in COMPLETED_TXN_COMPONENTS to store the write id, set to the respective values of TC_TXNID and CTC_TXNID from the same row.
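
For illustration only, here is a minimal Derby-style sketch of steps 1-4 above. It is an assumption-laden sketch, not the actual upgrade scripts shipped in the patch (the real changes live in the hive-schema-3.0.0.* and upgrade-2.3.0-to-3.0.0.* files listed earlier): the ACID_TABLES placeholder (a hypothetical listing of the databases/tables to migrate) and the reduced column sets are invented here for readability.

{noformat}
-- Step 1: metadata tables (column sets trimmed to those named in the description; illustrative only).
CREATE TABLE NEXT_WRITE_ID (
  NWI_DATABASE VARCHAR(128) NOT NULL,
  NWI_TABLE    VARCHAR(256) NOT NULL,
  NWI_NEXT     BIGINT NOT NULL
);

CREATE TABLE TXN_TO_WRITE_ID (
  T2W_TXNID    BIGINT NOT NULL,
  T2W_DATABASE VARCHAR(128) NOT NULL,
  T2W_TABLE    VARCHAR(256) NOT NULL,
  T2W_WRITEID  BIGINT NOT NULL
);

-- Step 2: seed each ACID/MM table's write-id counter from the global txn counter.
-- ACID_TABLES is a hypothetical helper listing (DB_NAME, TBL_NAME) of the tables being migrated.
INSERT INTO NEXT_WRITE_ID (NWI_DATABASE, NWI_TABLE, NWI_NEXT)
  SELECT A.DB_NAME, A.TBL_NAME, (SELECT NTXN_NEXT FROM NEXT_TXN_ID)
  FROM ACID_TABLES A;

-- Step 3: map every open/aborted txn to a write id equal to its txn id, for every migrated table.
-- (The real migration would restrict this to open/aborted transactions via TXNS.TXN_STATE.)
INSERT INTO TXN_TO_WRITE_ID (T2W_TXNID, T2W_DATABASE, T2W_TABLE, T2W_WRITEID)
  SELECT T.TXN_ID, A.DB_NAME, A.TBL_NAME, T.TXN_ID
  FROM TXNS T, ACID_TABLES A;

-- Step 4: carry the write id alongside the txn id in the component tables.
ALTER TABLE TXN_COMPONENTS ADD COLUMN TC_WRITEID BIGINT;
ALTER TABLE COMPLETED_TXN_COMPONENTS ADD COLUMN CTC_WRITEID BIGINT;
UPDATE TXN_COMPONENTS SET TC_WRITEID = TC_TXNID;
UPDATE COMPLETED_TXN_COMPONENTS SET CTC_WRITEID = CTC_TXNID;
{noformat}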



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
