[ https://issues.apache.org/jira/browse/HIVE-13241?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15202189#comment-15202189 ]

Hive QA commented on HIVE-13241:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12793858/HIVE-13241.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/7306/testReport
Console output: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/7306/console
Test logs: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-7306/

Messages:
{noformat}
**** This message was trimmed, see log for full details ****
[INFO] skip non existing resourceDirectory 
/data/hive-ptest/working/apache-github-source-source/hplsql/src/test/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hplsql ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: 
/data/hive-ptest/working/apache-github-source-source/hplsql/target/tmp
    [mkdir] Created dir: 
/data/hive-ptest/working/apache-github-source-source/hplsql/target/warehouse
    [mkdir] Created dir: 
/data/hive-ptest/working/apache-github-source-source/hplsql/target/tmp/conf
     [copy] Copying 16 files to 
/data/hive-ptest/working/apache-github-source-source/hplsql/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ 
hive-hplsql ---
[INFO] Compiling 2 source files to 
/data/hive-ptest/working/apache-github-source-source/hplsql/target/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hplsql ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hplsql ---
[INFO] Building jar: 
/data/hive-ptest/working/apache-github-source-source/hplsql/target/hive-hplsql-2.1.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ 
hive-hplsql ---
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hplsql ---
[INFO] Installing 
/data/hive-ptest/working/apache-github-source-source/hplsql/target/hive-hplsql-2.1.0-SNAPSHOT.jar
 to 
/data/hive-ptest/working/maven/org/apache/hive/hive-hplsql/2.1.0-SNAPSHOT/hive-hplsql-2.1.0-SNAPSHOT.jar
[INFO] Installing 
/data/hive-ptest/working/apache-github-source-source/hplsql/pom.xml to 
/data/hive-ptest/working/maven/org/apache/hive/hive-hplsql/2.1.0-SNAPSHOT/hive-hplsql-2.1.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HWI 2.1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hwi ---
[INFO] Deleting /data/hive-ptest/working/apache-github-source-source/hwi/target
[INFO] Deleting /data/hive-ptest/working/apache-github-source-source/hwi 
(includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-no-snapshots) @ 
hive-hwi ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hive-hwi ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive-hwi 
---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 
/data/hive-ptest/working/apache-github-source-source/hwi/src/main/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hwi ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hwi ---
[INFO] Compiling 6 source files to 
/data/hive-ptest/working/apache-github-source-source/hwi/target/classes
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ 
hive-hwi ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 
/data/hive-ptest/working/apache-github-source-source/hwi/src/test/resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hwi ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: 
/data/hive-ptest/working/apache-github-source-source/hwi/target/tmp
    [mkdir] Created dir: 
/data/hive-ptest/working/apache-github-source-source/hwi/target/warehouse
    [mkdir] Created dir: 
/data/hive-ptest/working/apache-github-source-source/hwi/target/tmp/conf
     [copy] Copying 16 files to 
/data/hive-ptest/working/apache-github-source-source/hwi/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ 
hive-hwi ---
[INFO] Compiling 2 source files to 
/data/hive-ptest/working/apache-github-source-source/hwi/target/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hwi ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hwi ---
[INFO] Building jar: 
/data/hive-ptest/working/apache-github-source-source/hwi/target/hive-hwi-2.1.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ 
hive-hwi ---
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hwi ---
[INFO] Installing 
/data/hive-ptest/working/apache-github-source-source/hwi/target/hive-hwi-2.1.0-SNAPSHOT.jar
 to 
/data/hive-ptest/working/maven/org/apache/hive/hive-hwi/2.1.0-SNAPSHOT/hive-hwi-2.1.0-SNAPSHOT.jar
[INFO] Installing 
/data/hive-ptest/working/apache-github-source-source/hwi/pom.xml to 
/data/hive-ptest/working/maven/org/apache/hive/hive-hwi/2.1.0-SNAPSHOT/hive-hwi-2.1.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Llap Server 2.1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-llap-server ---
[INFO] Deleting 
/data/hive-ptest/working/apache-github-source-source/llap-server/target
[INFO] Deleting 
/data/hive-ptest/working/apache-github-source-source/llap-server (includes = 
[datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-no-snapshots) @ 
hive-llap-server ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ 
hive-llap-server ---
[INFO] Source directory: 
/data/hive-ptest/working/apache-github-source-source/llap-server/src/gen/protobuf/gen-java
 added.
[INFO] Source directory: 
/data/hive-ptest/working/apache-github-source-source/llap-server/src/gen/thrift/gen-javabean
 added.
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ 
hive-llap-server ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ 
hive-llap-server ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 17 resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-llap-server ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ 
hive-llap-server ---
[INFO] Compiling 88 source files to 
/data/hive-ptest/working/apache-github-source-source/llap-server/target/classes
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING : 
[INFO] -------------------------------------------------------------
[WARNING] 
/data/hive-ptest/working/apache-github-source-source/llap-server/src/java/org/apache/hadoop/hive/llap/cache/SimpleAllocator.java:[29,16]
 sun.misc.Cleaner is internal proprietary API and may be removed in a future 
release
[WARNING] 
/data/hive-ptest/working/apache-github-source-source/llap-server/src/java/org/apache/hadoop/hive/llap/cache/SimpleAllocator.java:[29,16]
 sun.misc.Cleaner is internal proprietary API and may be removed in a future 
release
[WARNING] 
/data/hive-ptest/working/apache-github-source-source/llap-server/src/java/org/apache/hadoop/hive/llap/cache/SimpleAllocator.java:[29,16]
 sun.misc.Cleaner is internal proprietary API and may be removed in a future 
release
[WARNING] 
/data/hive-ptest/working/apache-github-source-source/llap-server/src/java/org/apache/hadoop/hive/llap/cache/SimpleAllocator.java:[74,9]
 sun.misc.Cleaner is internal proprietary API and may be removed in a future 
release
[WARNING] 
/data/hive-ptest/working/apache-github-source-source/llap-server/src/java/org/apache/hadoop/hive/llap/daemon/impl/TaskRunnerCallable.java:
 Some input files use or override a deprecated API.
[WARNING] 
/data/hive-ptest/working/apache-github-source-source/llap-server/src/java/org/apache/hadoop/hive/llap/daemon/impl/TaskRunnerCallable.java:
 Recompile with -Xlint:deprecation for details.
[WARNING] 
/data/hive-ptest/working/apache-github-source-source/llap-server/src/java/org/apache/hadoop/hive/llap/shufflehandler/DirWatcher.java:
 Some input files use unchecked or unsafe operations.
[WARNING] 
/data/hive-ptest/working/apache-github-source-source/llap-server/src/java/org/apache/hadoop/hive/llap/shufflehandler/DirWatcher.java:
 Recompile with -Xlint:unchecked for details.
[INFO] 8 warnings 
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] /data/hive-ptest/working/apache-github-source-source/llap-server/src/java/org/apache/hadoop/hive/llap/io/metadata/OrcFileEstimateErrors.java:[98,10] method addError in class org.apache.hadoop.hive.llap.io.metadata.OrcFileEstimateErrors cannot be applied to given types;
  required: long,int,long
  found: long,int
  reason: actual and formal argument lists differ in length
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Hive .............................................. SUCCESS [12.553s]
[INFO] Hive Shims Common ................................. SUCCESS [16.532s]
[INFO] Hive Shims 0.23 ................................... SUCCESS [14.955s]
[INFO] Hive Shims Scheduler .............................. SUCCESS [2.668s]
[INFO] Hive Shims ........................................ SUCCESS [2.638s]
[INFO] Hive Storage API .................................. SUCCESS [6.657s]
[INFO] Hive ORC .......................................... SUCCESS [13.682s]
[INFO] Hive Common ....................................... SUCCESS [27.313s]
[INFO] Hive Serde ........................................ SUCCESS [17.337s]
[INFO] Hive Metastore .................................... SUCCESS [55.183s]
[INFO] Hive Ant Utilities ................................ SUCCESS [1.842s]
[INFO] Hive Llap Common .................................. SUCCESS [18.581s]
[INFO] Hive Llap Client .................................. SUCCESS [11.432s]
[INFO] Hive Llap Tez ..................................... SUCCESS [11.645s]
[INFO] Hive Service RPC .................................. SUCCESS [7.398s]
[INFO] Spark Remote Client ............................... SUCCESS [12.820s]
[INFO] Hive Query Language ............................... SUCCESS [2:12.171s]
[INFO] Hive Service ...................................... SUCCESS [14.615s]
[INFO] Hive Accumulo Handler ............................. SUCCESS [6.017s]
[INFO] Hive JDBC ......................................... SUCCESS [18.657s]
[INFO] Hive Beeline ...................................... SUCCESS [3.759s]
[INFO] Hive CLI .......................................... SUCCESS [3.165s]
[INFO] Hive Contrib ...................................... SUCCESS [2.105s]
[INFO] Hive HBase Handler ................................ SUCCESS [9.307s]
[INFO] Hive HCatalog ..................................... SUCCESS [0.982s]
[INFO] Hive HCatalog Core ................................ SUCCESS [4.968s]
[INFO] Hive HCatalog Pig Adapter ......................... SUCCESS [3.310s]
[INFO] Hive HCatalog Server Extensions ................... SUCCESS [2.534s]
[INFO] Hive HCatalog Webhcat Java Client ................. SUCCESS [2.651s]
[INFO] Hive HCatalog Webhcat ............................. SUCCESS [18.794s]
[INFO] Hive HCatalog Streaming ........................... SUCCESS [3.405s]
[INFO] Hive HPL/SQL ...................................... SUCCESS [18.546s]
[INFO] Hive HWI .......................................... SUCCESS [1.699s]
[INFO] Hive Llap Server .................................. FAILURE [4.609s]
[INFO] Hive Shims Aggregator ............................. SKIPPED
[INFO] Hive TestUtils .................................... SKIPPED
[INFO] Hive Packaging .................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 8:07.817s
[INFO] Finished at: Fri Mar 18 17:27:26 EDT 2016
[INFO] Final Memory: 238M/1120M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hadoop-2" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-llap-server: Compilation failure
[ERROR] /data/hive-ptest/working/apache-github-source-source/llap-server/src/java/org/apache/hadoop/hive/llap/io/metadata/OrcFileEstimateErrors.java:[98,10] method addError in class org.apache.hadoop.hive.llap.io.metadata.OrcFileEstimateErrors cannot be applied to given types;
[ERROR] required: long,int,long
[ERROR] found: long,int
[ERROR] reason: actual and formal argument lists differ in length
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-llap-server
+ exit 1
'
{noformat}
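The build failure above is a plain arity mismatch: javac reports that the call at OrcFileEstimateErrors.java line 98 passes two arguments (`long, int`) to an `addError` method declared with three parameters (`long, int, long`). A minimal sketch of that shape is below; only the parameter types come from the compiler output, and the parameter names, return value, and class name are illustrative assumptions, not the real Hive code:

```java
// Hypothetical sketch of the signature javac reports; names are illustrative.
public class OrcFileEstimateErrorsSketch {

    // Declared form per the compiler message: three parameters (long, int, long).
    static String addError(long offset, int length, long fileSize) {
        return "error@" + offset + " len=" + length + " file=" + fileSize;
    }

    public static void main(String[] args) {
        // A two-argument call such as addError(offset, length) would fail with
        // "actual and formal argument lists differ in length"; the caller has
        // to supply the third argument the declaration requires.
        long offset = 1373931L;
        int length = 86587;
        long fileSize = 5372745L;
        System.out.println(addError(offset, length, fileSize));
    }
}
```

In practice this pattern usually means the patch changed the method declaration (or the call site) without updating the other, so the fix is to bring the two back in sync rather than anything subtle.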

This message is automatically generated.

ATTACHMENT ID: 12793858 - PreCommit-HIVE-TRUNK-Build

> LLAP: Incremental Caching marks some small chunks as "incomplete CB"
> --------------------------------------------------------------------
>
>                 Key: HIVE-13241
>                 URL: https://issues.apache.org/jira/browse/HIVE-13241
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Gopal V
>            Assignee: Sergey Shelukhin
>         Attachments: HIVE-13241.patch
>
>
> Run #3 of a query with 1 node still has cache misses.
> {code}
> LLAP IO Summary
> ----------------------------------------------------------------------------------------------
>   VERTICES ROWGROUPS  META_HIT  META_MISS  DATA_HIT  DATA_MISS  ALLOCATION  USED  TOTAL_IO
> ----------------------------------------------------------------------------------------------
>      Map 1        11      1116          0    1.65GB    93.61MB          0B    0B    32.72s
> ----------------------------------------------------------------------------------------------
> {code}
> {code}
> 2016-03-08T21:05:39,417 INFO  
> [IO-Elevator-Thread-9[attempt_1455662455106_2688_3_00_000001_0]]: 
> encoded.EncodedReaderImpl 
> (EncodedReaderImpl.java:prepareRangesForCompressedRead(695)) - Locking 
> 0x1c44401d(1) due to reuse
> 2016-03-08T21:05:39,417 INFO  
> [IO-Elevator-Thread-9[attempt_1455662455106_2688_3_00_000001_0]]: 
> encoded.EncodedReaderImpl 
> (EncodedReaderImpl.java:prepareRangesForCompressedRead(701)) - Adding an 
> already-uncompressed buffer 0x1c44401d(2)
> 2016-03-08T21:05:39,417 INFO  
> [IO-Elevator-Thread-9[attempt_1455662455106_2688_3_00_000001_0]]: 
> encoded.EncodedReaderImpl 
> (EncodedReaderImpl.java:prepareRangesForCompressedRead(695)) - Locking 
> 0x4e51b032(1) due to reuse
> 2016-03-08T21:05:39,417 INFO  
> [IO-Elevator-Thread-9[attempt_1455662455106_2688_3_00_000001_0]]: 
> encoded.EncodedReaderImpl 
> (EncodedReaderImpl.java:prepareRangesForCompressedRead(701)) - Adding an 
> already-uncompressed buffer 0x4e51b032(2)
> 2016-03-08T21:05:39,418 INFO  
> [IO-Elevator-Thread-9[attempt_1455662455106_2688_3_00_000001_0]]: 
> encoded.EncodedReaderImpl 
> (EncodedReaderImpl.java:addOneCompressionBuffer(1161)) - Found CB at 1373931, 
> chunk length 86587, total 86590, compressed
> 2016-03-08T21:05:39,418 INFO  
> [IO-Elevator-Thread-9[attempt_1455662455106_2688_3_00_000001_0]]: 
> encoded.EncodedReaderImpl 
> (EncodedReaderImpl.java:addIncompleteCompressionBuffer(1241)) - Replacing 
> data range [1373931, 1408408), size: 34474(!) type: direct (and 0 previous 
> chunks) with incomplete CB start: 1373931 end: 1408408 in the buffers
> 2016-03-08T21:05:39,418 INFO  
> [IO-Elevator-Thread-9[attempt_1455662455106_2688_3_00_000001_0]]: 
> encoded.EncodedReaderImpl 
> (EncodedReaderImpl.java:createRgColumnStreamData(441)) - Getting data for 
> column 7 RG 14 stream DATA at 1460521, 319811 index position 0: compressed 
> [1626961, 1780332)
> {code}
> {code}
> 2016-03-08T21:05:38,925 INFO  
> [IO-Elevator-Thread-7[attempt_1455662455106_2688_3_00_000001_0]]: 
> encoded.OrcEncodedDataReader (OrcEncodedDataReader.java:readFileData(878)) - 
> Disk ranges after disk read (file 5372745, base offset 3): [{start: 18986 
> end: 20660 cache buffer: 0x660faf7c(1)}, {start: 20660 end: 35775 cache 
> buffer: 0x1dcb1d97(1)}, {start: 318852 end: 422353 cache buffer: 
> 0x6c7f9a05(1)}, {start: 1148616 end: 1262468 cache buffer: 0x196e1d41(1)}, 
> {start: 1262468 end: 1376342 cache buffer: 0x201255f(1)}, {data range 
> [1376342, 1410766), size: 34424 type: direct}, {start: 1631359 end: 1714694 
> cache buffer: 0x47e3a72d(1)}, {start: 1714694 end: 1785770 cache buffer: 
> 0x57dca266(1)}, {start: 4975035 end: 5095215 cache buffer: 0x3e3139c9(1)}, 
> {start: 5095215 end: 5197863 cache buffer: 0x3511c88d(1)}, {start: 7448387 
> end: 7572268 cache buffer: 0x6f11dbcd(1)}, {start: 7572268 end: 7696182 cache 
> buffer: 0x5d6c9bdb(1)}, {data range [7696182, 7710537), size: 14355 type: 
> direct}, {start: 8235756 end: 8345367 cache buffer: 0x6a241ece(1)}, {start: 
> 8345367 end: 8455009 cache buffer: 0x51caf6a7(1)}, {data range [8455009, 
> 8497906), size: 42897 type: direct}, {start: 9035815 end: 9159708 cache 
> buffer: 0x306480e0(1)}, {start: 9159708 end: 9283629 cache buffer: 
> 0x9ef7774(1)}, {data range [9283629, 9297965), size: 14336 type: direct}, 
> {start: 9989884 end: 10113731 cache buffer: 0x43f7cae9(1)}, {start: 10113731 
> end: 10237589 cache buffer: 0x458e63fe(1)}, {data range [10237589, 10252034), 
> size: 14445 type: direct}, {start: 11897896 end: 12021787 cache buffer: 
> 0x51f9982f(1)}, {start: 12021787 end: 12145656 cache buffer: 0x23df01b3(1)}, 
> {data range [12145656, 12160046), size: 14390 type: direct}, {start: 12851928 
> end: 12975795 cache buffer: 0x5e0237a3(1)}, {start: 12975795 end: 13099664 
> cache buffer: 0x68252e0e(1)}, {data range [13099664, 13114078), size: 14414 
> type: direct}, {start: 13805890 end: 13929768 cache buffer: 0x7500fbc5(1)}, 
> {start: 13929768 end: 14053619 cache buffer: 0x2e89be4f(1)}, {data range 
> [14053619, 14068040), size: 14421 type: direct}, {start: 14759988 end: 
> 14883857 cache buffer: 0x61f92b12(1)}, {start: 14883857 end: 15007724 cache 
> buffer: 0x20ed3c7d(1)}, {data range [15007724, 15022138), size: 14414 type: 
> direct}]
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
