[ https://issues.apache.org/jira/browse/HIVE-18350?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16355173#comment-16355173 ]

Hive QA commented on HIVE-18350:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12909489/HIVE-18350.15.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/9065/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/9065/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-9065/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-02-07 08:58:57.108
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-9065/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-02-07 08:58:57.111
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   2422e18..acc62e3  master     -> origin/master
+ git reset --hard HEAD
HEAD is now at 2422e18 HIVE-18467: support whole warehouse dump / load + 
create/drop database events (Anishek Agarwal, reviewed by Sankar Hariappan)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is behind 'origin/master' by 3 commits, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/master
HEAD is now at acc62e3 HIVE-18628: Make tez dag status check interval 
configurable (Prasanth Jayachandran reviewed by Sergey Shelukhin)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-02-07 08:59:01.197
+ rm -rf ../yetus
+ mkdir ../yetus
+ git gc
+ cp -R . ../yetus
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-9065/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/CustomPartitionVertex.java: 
does not exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/CustomVertexConfiguration.java:
 does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/DagUtils.java: does not 
exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/metadata/Table.java: does not 
exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/ConvertJoinMapJoin.java: does 
not exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/metainfo/annotation/OpTraitsRulesProcFactory.java:
 does not exist in index
error: 
a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/spark/SparkMapJoinOptimizer.java:
 does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/LoadSemanticAnalyzer.java: 
does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/OpTraits.java: does not 
exist in index
error: a/ql/src/test/org/apache/hadoop/hive/ql/metadata/TestHive.java: does not 
exist in index
error: a/ql/src/test/queries/clientpositive/auto_sortmerge_join_2.q: does not 
exist in index
error: a/ql/src/test/queries/clientpositive/auto_sortmerge_join_4.q: does not 
exist in index
error: a/ql/src/test/queries/clientpositive/auto_sortmerge_join_5.q: does not 
exist in index
error: a/ql/src/test/queries/clientpositive/auto_sortmerge_join_7.q: does not 
exist in index
error: a/ql/src/test/results/clientnegative/bucket_mapjoin_mismatch1.q.out: 
does not exist in index
error: a/ql/src/test/results/clientpositive/auto_sortmerge_join_2.q.out: does 
not exist in index
error: a/ql/src/test/results/clientpositive/auto_sortmerge_join_4.q.out: does 
not exist in index
error: a/ql/src/test/results/clientpositive/auto_sortmerge_join_5.q.out: does 
not exist in index
error: a/ql/src/test/results/clientpositive/auto_sortmerge_join_7.q.out: does 
not exist in index
error: a/ql/src/test/results/clientpositive/llap/auto_sortmerge_join_2.q.out: 
does not exist in index
error: a/ql/src/test/results/clientpositive/llap/auto_sortmerge_join_4.q.out: 
does not exist in index
error: a/ql/src/test/results/clientpositive/llap/auto_sortmerge_join_5.q.out: 
does not exist in index
error: a/ql/src/test/results/clientpositive/llap/auto_sortmerge_join_7.q.out: 
does not exist in index
error: a/ql/src/test/results/clientpositive/spark/auto_sortmerge_join_2.q.out: 
does not exist in index
error: a/ql/src/test/results/clientpositive/spark/auto_sortmerge_join_4.q.out: 
does not exist in index
error: a/ql/src/test/results/clientpositive/spark/auto_sortmerge_join_5.q.out: 
does not exist in index
error: a/ql/src/test/results/clientpositive/spark/auto_sortmerge_join_7.q.out: 
does not exist in index
error: a/standalone-metastore/src/gen/thrift/gen-cpp/ThriftHiveMetastore.cpp: 
does not exist in index
error: a/standalone-metastore/src/gen/thrift/gen-cpp/hive_metastore_types.cpp: 
does not exist in index
error: a/standalone-metastore/src/gen/thrift/gen-cpp/hive_metastore_types.h: 
does not exist in index
error: 
a/standalone-metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/Table.java:
 does not exist in index
error: a/standalone-metastore/src/gen/thrift/gen-php/metastore/Types.php: does 
not exist in index
error: a/standalone-metastore/src/gen/thrift/gen-py/hive_metastore/ttypes.py: 
does not exist in index
error: a/standalone-metastore/src/gen/thrift/gen-rb/hive_metastore_types.rb: 
does not exist in index
error: 
a/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/ObjectStore.java:
 does not exist in index
error: 
a/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/model/MTable.java:
 does not exist in index
error: a/standalone-metastore/src/main/thrift/hive_metastore.thrift: does not 
exist in index
error: 
a/standalone-metastore/src/test/java/org/apache/hadoop/hive/metastore/cache/TestCachedStore.java:
 does not exist in index
error: 
a/standalone-metastore/src/test/java/org/apache/hadoop/hive/metastore/client/TestTablesCreateDropAlterTruncate.java:
 does not exist in index
Going to apply patch with: git apply -p1
/data/hiveptest/working/scratch/build.patch:16962: trailing whitespace.
     * 
/data/hiveptest/working/scratch/build.patch:16996: trailing whitespace.
    tmpMap.put(_Fields.BUCKETING_VERSION, new 
org.apache.thrift.meta_data.FieldMetaData("bucketingVersion", 
org.apache.thrift.TFieldRequirementType.OPTIONAL, 
/data/hiveptest/working/scratch/build.patch:16998: trailing whitespace.
    tmpMap.put(_Fields.LOAD_IN_BUCKETED_TABLE, new 
org.apache.thrift.meta_data.FieldMetaData("loadInBucketedTable", 
org.apache.thrift.TFieldRequirementType.OPTIONAL, 
/data/hiveptest/working/scratch/build.patch:17041: trailing whitespace.
   * 
/data/hiveptest/working/scratch/build.patch:17049: trailing whitespace.
   * 
warning: squelched 2 whitespace errors
warning: 7 lines add whitespace errors.
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: protoc version: 2.5.0, detected platform: linux-x86_64 (linux/amd64)
protoc-jar: embedded: bin/2.5.0/protoc-2.5.0-linux-x86_64.exe
protoc-jar: executing: [/tmp/protocjar6798693510032262221/bin/protoc.exe, 
--version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protocjar6798693510032262221/bin/protoc.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
[ERROR] COMPILATION ERROR : 
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/Table.java:[82,11]
 cannot find symbol
  symbol:   class BucketingVersion
  location: class org.apache.hadoop.hive.metastore.api.Table
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/Table.java:[774,10]
 cannot find symbol
  symbol:   class BucketingVersion
  location: class org.apache.hadoop.hive.metastore.api.Table
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/Table.java:[782,35]
 cannot find symbol
  symbol:   class BucketingVersion
  location: class org.apache.hadoop.hive.metastore.api.Table
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/model/MTable.java:[21,44]
 cannot find symbol
  symbol:   class BucketingVersion
  location: package org.apache.hadoop.hive.metastore.api
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/model/MTable.java:[42,11]
 cannot find symbol
  symbol:   class BucketingVersion
  location: class org.apache.hadoop.hive.metastore.model.MTable
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/model/MTable.java:[65,25]
 cannot find symbol
  symbol:   class BucketingVersion
  location: class org.apache.hadoop.hive.metastore.model.MTable
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/model/MTable.java:[283,35]
 cannot find symbol
  symbol:   class BucketingVersion
  location: class org.apache.hadoop.hive.metastore.model.MTable
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/model/MTable.java:[290,10]
 cannot find symbol
  symbol:   class BucketingVersion
  location: class org.apache.hadoop.hive.metastore.model.MTable
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-compiler-plugin:3.5.1:compile (default-compile) 
on project hive-standalone-metastore: Compilation failure: Compilation failure:
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/Table.java:[82,11]
 cannot find symbol
[ERROR] symbol:   class BucketingVersion
[ERROR] location: class org.apache.hadoop.hive.metastore.api.Table
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/Table.java:[774,10]
 cannot find symbol
[ERROR] symbol:   class BucketingVersion
[ERROR] location: class org.apache.hadoop.hive.metastore.api.Table
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/Table.java:[782,35]
 cannot find symbol
[ERROR] symbol:   class BucketingVersion
[ERROR] location: class org.apache.hadoop.hive.metastore.api.Table
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/model/MTable.java:[21,44]
 cannot find symbol
[ERROR] symbol:   class BucketingVersion
[ERROR] location: package org.apache.hadoop.hive.metastore.api
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/model/MTable.java:[42,11]
 cannot find symbol
[ERROR] symbol:   class BucketingVersion
[ERROR] location: class org.apache.hadoop.hive.metastore.model.MTable
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/model/MTable.java:[65,25]
 cannot find symbol
[ERROR] symbol:   class BucketingVersion
[ERROR] location: class org.apache.hadoop.hive.metastore.model.MTable
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/model/MTable.java:[283,35]
 cannot find symbol
[ERROR] symbol:   class BucketingVersion
[ERROR] location: class org.apache.hadoop.hive.metastore.model.MTable
[ERROR] 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/model/MTable.java:[290,10]
 cannot find symbol
[ERROR] symbol:   class BucketingVersion
[ERROR] location: class org.apache.hadoop.hive.metastore.model.MTable
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-standalone-metastore
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12909489 - PreCommit-HIVE-Build

> load data should rename files consistent with insert statements
> ---------------------------------------------------------------
>
>                 Key: HIVE-18350
>                 URL: https://issues.apache.org/jira/browse/HIVE-18350
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Deepak Jaiswal
>            Assignee: Deepak Jaiswal
>            Priority: Major
>         Attachments: HIVE-18350.1.patch, HIVE-18350.10.patch, 
> HIVE-18350.11.patch, HIVE-18350.12.patch, HIVE-18350.13.patch, 
> HIVE-18350.14.patch, HIVE-18350.15.patch, HIVE-18350.2.patch, 
> HIVE-18350.3.patch, HIVE-18350.4.patch, HIVE-18350.5.patch, 
> HIVE-18350.6.patch, HIVE-18350.7.patch, HIVE-18350.8.patch, HIVE-18350.9.patch
>
>
> Insert statements create files with names ending in 0000_0, 0001_0, etc. 
> However, load data keeps the input file name. The resulting inconsistent 
> naming convention makes SMB joins difficult in some scenarios and may cause 
> trouble for other types of queries in the future.
> We need a consistent naming convention; the sketch below illustrates the 
> difference.
> For non-bucketed tables, Hive renames all the files regardless of how the 
> user named them.
> For bucketed tables, in non-strict mode Hive relies on the user to name the 
> files to match the buckets, and assumes that all data in a file belongs to 
> the same bucket. In strict mode, loading into bucketed tables is disabled.
> This will likely affect most of the tests that load data, which is 
> significant enough that the work is further divided into two subtasks for a 
> smoother merge.
> For existing tables in a customer database, it is recommended to reload 
> bucketed tables; otherwise, if the customer runs an SMB join and a bucket 
> has no split, incorrect results are possible. This is not a regression, as 
> it would happen even without the patch. With this patch and a data reload, 
> however, the results should be correct.
> For non-bucketed tables and external tables, there is no difference in 
> behavior and reloading data is not needed.
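
To make the naming difference concrete, here is a minimal, hypothetical HiveQL sketch; the table, staging path, and file names are illustrative only and are not taken from the patch or its tests:

{noformat}
-- Hypothetical bucketed table; all names below are illustrative.
CREATE TABLE smb_src (key INT, value STRING)
CLUSTERED BY (key) INTO 2 BUCKETS
STORED AS ORC;

-- An INSERT writes task-named files under the table directory,
-- e.g. 000000_0 and 000001_0.
INSERT INTO TABLE smb_src VALUES (1, 'a'), (2, 'b');

-- Before this patch, LOAD DATA keeps the user's original file name,
-- e.g. mydata_part1.orc, so the directory mixes naming schemes and
-- breaks the bucket-to-file assumptions SMB joins rely on.
LOAD DATA INPATH '/tmp/mydata_part1.orc' INTO TABLE smb_src;
{noformat}

With the patch, the loaded file would be renamed to follow the same task-style convention used by inserts, so both code paths produce table directories that SMB joins can consume.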



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
