[ https://issues.apache.org/jira/browse/HIVE-22031?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16890491#comment-16890491 ]
Hive QA commented on HIVE-22031:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12975412/HIVE-22031.02.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 37 failed/errored test(s), 16682 tests executed

*Failed tests:*
{noformat}
org.apache.hadoop.hive.ql.TestTxnCommands2.testACIDwithSchemaEvolutionAndCompaction (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testAcidWithSchemaEvolution (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testAlterTable (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testBucketCodec (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testBucketizedInputFormat (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testCleanerForTxnToWriteId (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testCompactWithDelete (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testDeleteEventsCompaction (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testDeleteIn (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testDynamicPartitionsMerge (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testDynamicPartitionsMerge2 (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testETLSplitStrategyForACID (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testEmptyInTblproperties (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testFailHeartbeater (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testFailureOnAlteringTransactionalProperties (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testFileSystemUnCaching (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testInitiatorWithMultipleFailedCompactions (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testInsertOverwrite1 (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testInsertOverwrite2 (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testInsertOverwriteWithSelfJoin (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testMerge (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testMerge2 (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testMerge3 (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testMergeWithPredicate (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testMmTableCompaction (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testMultiInsert (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testMultiInsertStatement (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testNonAcidInsert (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testNonAcidToAcidConversion02 (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testOpenTxnsCounter (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testOrcNoPPD (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testOrcPPD (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testOriginalFileReaderWhenNonAcidConvertedToAcid (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testUpdateMixedCase (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.testValidTxnsBookkeeping (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.updateDeletePartitioned (batchId=330)
org.apache.hadoop.hive.ql.TestTxnCommands2.writeBetweenWorkerAndCleaner (batchId=330)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/18135/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/18135/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-18135/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 37 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12975412 - PreCommit-HIVE-Build

> HiveRelDecorrelator fails with IndexOutOfBoundsException if the query contains several "constant" columns
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-22031
>                 URL: https://issues.apache.org/jira/browse/HIVE-22031
>             Project: Hive
>          Issue Type: Bug
>          Components: CBO
>    Affects Versions: 2.3.5
>            Reporter: Artem Velykorodnyi
>            Assignee: Artem Velykorodnyi
>            Priority: Major
>         Attachments: HIVE-22031.02.patch, HIVE-22031.1.patch, HIVE-22031.patch
>
> Steps for reproducing:
> {code}
> 1. Create table orders
> create table orders (ORD_NUM INT, CUST_CODE STRING);
> 2. Create table customers
> create table customers (CUST_CODE STRING);
> 3. Make select with constants and with a subquery:
> select DISTINCT(CUST_CODE), '777' as ANY, ORD_NUM, '888' as CONSTANT
> from orders
> WHERE not exists
> (select 1
> from customers
> WHERE CUST_CODE=orders.CUST_CODE
> );
> {code}
> Query fails with IndexOutOfBoundsException
> {code}
> Exception in thread "main" java.lang.AssertionError: Internal error: While invoking method 'public org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveRelDecorrelator$Frame org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveRelDecorrelator.decorrelateRel(org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveProject) throws org.apache.hadoop.hive.ql.parse.SemanticException'
>     at org.apache.calcite.util.Util.newInternal(Util.java:792)
>     at org.apache.calcite.util.ReflectUtil$2.invoke(ReflectUtil.java:534)
>     at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveRelDecorrelator.getInvoke(HiveRelDecorrelator.java:660)
>     at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveRelDecorrelator.decorrelate(HiveRelDecorrelator.java:252)
>     at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveRelDecorrelator.decorrelateQuery(HiveRelDecorrelator.java:218)
>     at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1347)
>     at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1261)
>     at org.apache.calcite.tools.Frameworks$1.apply(Frameworks.java:113)
>     at org.apache.calcite.prepare.CalcitePrepareImpl.perform(CalcitePrepareImpl.java:997)
>     at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:149)
>     at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:106)
>     at org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1069)
>     at org.apache.hadoop.hive.ql.parse.CalcitePlanner.getOptimizedAST(CalcitePlanner.java:1085)
>     at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:364)
>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:11138)
>     at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:286)
>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:258)
>     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:512)
>     at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1317)
>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1457)
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
>     at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>     at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.calcite.util.ReflectUtil$2.invoke(ReflectUtil.java:531)
>     ... 32 more
> Caused by: java.lang.AssertionError: Internal error: While invoking method 'public org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveRelDecorrelator$Frame org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveRelDecorrelator.decorrelateRel(org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveAggregate) throws org.apache.hadoop.hive.ql.parse.SemanticException'
>     at org.apache.calcite.util.Util.newInternal(Util.java:792)
>     at org.apache.calcite.util.ReflectUtil$2.invoke(ReflectUtil.java:534)
>     at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveRelDecorrelator.getInvoke(HiveRelDecorrelator.java:660)
>     at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveRelDecorrelator.decorrelateRel(HiveRelDecorrelator.java:854)
>     ... 37 more
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.calcite.util.ReflectUtil$2.invoke(ReflectUtil.java:531)
>     ... 39 more
> Caused by: java.lang.IndexOutOfBoundsException: Index: 3, Size: 2
>     at java.util.ArrayList.rangeCheckForAdd(ArrayList.java:665)
>     at java.util.ArrayList.add(ArrayList.java:477)
>     at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HiveRelDecorrelator.decorrelateRel(HiveRelDecorrelator.java:833)
>     ... 44 more
> {code}
> HiveRelDecorrelator collects the omitted constant columns into a TreeMap whose keys are the column positions in the top-level SELECT.
> For the query from the example, the TreeMap contains:
> {code}
> 0 = {TreeMap$Entry@8389} "1" -> "_UTF-16LE'777'"
> 1 = {TreeMap$Entry@8390} "3" -> "_UTF-16LE'888'"
> {code}
> After that, there is a step where the list of fields is combined with the constants from the TreeMap:
> {code}
> if (!omittedConstants.isEmpty()) {
>   final List<RexNode> postProjects = new ArrayList<>(relBuilder.fields());
>   for (Map.Entry<Integer, RexLiteral> entry
>       : omittedConstants.descendingMap().entrySet()) {
>     postProjects.add(entry.getKey() + frame.corDefOutputs.size(),
>         entry.getValue());
>   }
>   relBuilder.project(postProjects);
> }
> {code}
> Because the TreeMap is iterated in descending order, the constant column with the highest position number is inserted first, and that position can be greater than the current size of the target list. (For the query from the example, the code attempts to add an element at index 3 while the list size is only 2.)
> If the TreeMap is iterated without descending order, everything works as expected. There is also no difference in the resulting projection between the ascending and descending map, because the list is filled by explicit index rather than by sequential position (a standalone sketch illustrating this is appended below).
> A q-file with the query from the example works fine, but the query fails on Hive 2.3.5.

--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
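To make the failure mode concrete, here is a minimal self-contained Java sketch of the insertion step described above. It is not Hive code: the field names, literal strings, and the assumption that frame.corDefOutputs is empty (offset 0) are simplifications taken from the example query, where the non-constant fields are [CUST_CODE, ORD_NUM] and the omitted constants sit at positions 1 and 3.

{code}
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class OmittedConstantsDemo {
  public static void main(String[] args) {
    // Omitted constants keyed by their position in the top-level SELECT.
    TreeMap<Integer, String> omittedConstants = new TreeMap<>();
    omittedConstants.put(1, "'777'");
    omittedConstants.put(3, "'888'");

    // Descending iteration (what the quoted decorrelator code does):
    // the first add() targets index 3 while the list still has only
    // 2 elements, which throws IndexOutOfBoundsException.
    List<String> postProjects = new ArrayList<>(List.of("CUST_CODE", "ORD_NUM"));
    try {
      for (Map.Entry<Integer, String> entry : omittedConstants.descendingMap().entrySet()) {
        postProjects.add(entry.getKey(), entry.getValue());
      }
    } catch (IndexOutOfBoundsException e) {
      System.out.println("descending insertion fails: " + e); // Index: 3, Size: 2
    }

    // Ascending iteration: '777' is inserted at index 1 first (size grows to 3),
    // so the subsequent add() at index 3 is legal (index == size) and the
    // original column order is restored.
    postProjects = new ArrayList<>(List.of("CUST_CODE", "ORD_NUM"));
    for (Map.Entry<Integer, String> entry : omittedConstants.entrySet()) {
      postProjects.add(entry.getKey(), entry.getValue());
    }
    System.out.println(postProjects); // [CUST_CODE, '777', ORD_NUM, '888']
  }
}
{code}

The caught exception message matches the "Index: 3, Size: 2" seen in the stack trace above.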