[ https://issues.apache.org/jira/browse/HIVE-16362?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17087743#comment-17087743 ]
David Mollitor commented on HIVE-16362:
---------------------------------------

[~mgergely] Is this still an issue? I know you've been looking at HoS tests.

> Flaky test: TestMiniLlapLocalCliDriver.testCliDriver[vector_count_distinct]
> ---------------------------------------------------------------------------
>
>                 Key: HIVE-16362
>                 URL: https://issues.apache.org/jira/browse/HIVE-16362
>             Project: Hive
>          Issue Type: Sub-task
>            Reporter: Sahil Takiar
>            Priority: Major
>
> Jenkins Link: https://builds.apache.org/job/PreCommit-HIVE-Build/4520/testReport/org.apache.hadoop.hive.cli/TestMiniLlapLocalCliDriver/testCliDriver_vector_count_distinct_/
>
> I see this in the {{hive.log}} file:
> {code}
> 2017-04-03T13:23:19,766 DEBUG [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] metadata.Hive: Cancelling 30 dynamic loading tasks
> 2017-04-03T13:23:19,768 ERROR [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] exec.Task: Failed with exception Exception when loading 30 in table web_sales with loadPath=file:/home/hiveptest/35.184.244.143-hiveptest-0/apache-github-source-source/itests/qtest/target/localfs/warehouse/web_sales/.hive-staging_hive_2017-04-03_13-23-18_143_3140419982719291367-1/-ext-10000
> org.apache.hadoop.hive.ql.metadata.HiveException: Exception when loading 30 in table web_sales with loadPath=file:/home/hiveptest/35.184.244.143-hiveptest-0/apache-github-source-source/itests/qtest/target/localfs/warehouse/web_sales/.hive-staging_hive_2017-04-03_13-23-18_143_3140419982719291367-1/-ext-10000
>     at org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(Hive.java:1963)
>     at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:432)
>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2184)
>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1840)
>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1527)
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1236)
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1226)
>     at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:336)
>     at org.apache.hadoop.hive.ql.QTestUtil.executeClientInternal(QTestUtil.java:1338)
>     at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:1312)
>     at org.apache.hadoop.hive.cli.control.CoreCliDriver.runTest(CoreCliDriver.java:173)
>     at org.apache.hadoop.hive.cli.control.CliAdapter.runTest(CliAdapter.java:104)
>     at org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver(TestMiniLlapLocalCliDriver.java:59)
>     at sun.reflect.GeneratedMethodAccessor188.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>     at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at org.apache.hadoop.hive.cli.control.CliAdapter$2$1.evaluate(CliAdapter.java:92)
>     at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>     at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>     at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>     at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>     at org.junit.runners.Suite.runChild(Suite.java:127)
>     at org.junit.runners.Suite.runChild(Suite.java:26)
>     at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>     at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>     at org.apache.hadoop.hive.cli.control.CliAdapter$1$1.evaluate(CliAdapter.java:73)
>     at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>     at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:283)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:173)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
>     at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:128)
>     at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:203)
>     at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:155)
>     at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> Caused by: java.util.concurrent.ExecutionException: org.apache.hadoop.hive.ql.metadata.HiveException: org.datanucleus.exceptions.NucleusDataStoreException: Iteration request failed : SELECT A0.COMMENT,A0."COLUMN_NAME",A0.TYPE_NAME,A0.INTEGER_IDX AS NUCORDER0 FROM COLUMNS_V2 A0 WHERE A0.CD_ID = ? AND A0.INTEGER_IDX >= 0 ORDER BY NUCORDER0
>     at java.util.concurrent.FutureTask.report(FutureTask.java:122)
>     at java.util.concurrent.FutureTask.get(FutureTask.java:192)
>     at org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(Hive.java:1954)
>     ... 52 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.datanucleus.exceptions.NucleusDataStoreException: Iteration request failed : SELECT A0.COMMENT,A0."COLUMN_NAME",A0.TYPE_NAME,A0.INTEGER_IDX AS NUCORDER0 FROM COLUMNS_V2 A0 WHERE A0.CD_ID = ? AND A0.INTEGER_IDX >= 0 ORDER BY NUCORDER0
>     at org.apache.hadoop.hive.ql.metadata.Hive.getPartition(Hive.java:2249)
>     at org.apache.hadoop.hive.ql.metadata.Hive.getPartition(Hive.java:2181)
>     at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1611)
>     at org.apache.hadoop.hive.ql.metadata.Hive$3.call(Hive.java:1922)
>     at org.apache.hadoop.hive.ql.metadata.Hive$3.call(Hive.java:1913)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>     at java.lang.Thread.run(Thread.java:745)
> Caused by: org.datanucleus.exceptions.NucleusDataStoreException: Iteration request failed : SELECT A0.COMMENT,A0."COLUMN_NAME",A0.TYPE_NAME,A0.INTEGER_IDX AS NUCORDER0 FROM COLUMNS_V2 A0 WHERE A0.CD_ID = ? AND A0.INTEGER_IDX >= 0 ORDER BY NUCORDER0
>     at org.datanucleus.store.rdbms.scostore.JoinListStore.listIterator(JoinListStore.java:806)
>     at org.datanucleus.store.rdbms.scostore.AbstractListStore.listIterator(AbstractListStore.java:93)
>     at org.datanucleus.store.rdbms.scostore.AbstractListStore.iterator(AbstractListStore.java:83)
>     at org.datanucleus.store.types.wrappers.backed.List.loadFromStore(List.java:264)
>     at org.datanucleus.store.types.wrappers.backed.List.iterator(List.java:492)
>     at org.apache.hadoop.hive.metastore.ObjectStore.convertToFieldSchemas(ObjectStore.java:1555)
>     at org.apache.hadoop.hive.metastore.ObjectStore.convertToStorageDescriptor(ObjectStore.java:1622)
>     at org.apache.hadoop.hive.metastore.ObjectStore.convertToStorageDescriptor(ObjectStore.java:1637)
>     at org.apache.hadoop.hive.metastore.ObjectStore.convertToTable(ObjectStore.java:1494)
>     at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:1188)
>     at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
>     at com.sun.proxy.$Proxy46.getTable(Unknown Source)
>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.fireReadTablePreEvent(HiveMetaStore.java:3371)
>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_partition_with_auth(HiveMetaStore.java:3387)
>     at sun.reflect.GeneratedMethodAccessor77.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
>     at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
>     at com.sun.proxy.$Proxy47.get_partition_with_auth(Unknown Source)
>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getPartitionWithAuthInfo(HiveMetaStoreClient.java:1307)
>     at sun.reflect.GeneratedMethodAccessor76.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
>     at com.sun.proxy.$Proxy48.getPartitionWithAuthInfo(Unknown Source)
>     at org.apache.hadoop.hive.ql.metadata.Hive.getPartition(Hive.java:2240)
>     ... 8 more
> Caused by: java.sql.SQLException: Java exception: 'org.apache.derby.iapi.services.io.FormatableBitSet cannot be cast to org.apache.derby.iapi.store.access.StaticCompiledOpenConglomInfo: java.lang.ClassCastException'.
>     at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
>     at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)
>     at org.apache.derby.impl.jdbc.Util.javaException(Unknown Source)
>     at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
>     at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
>     at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
>     at org.apache.derby.impl.jdbc.ConnectionChild.handleException(Unknown Source)
>     at org.apache.derby.impl.jdbc.EmbedStatement.executeStatement(Unknown Source)
>     at org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeStatement(Unknown Source)
>     at org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeQuery(Unknown Source)
>     at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:174)
>     at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeQuery(ParamLoggingPreparedStatement.java:375)
>     at org.datanucleus.store.rdbms.SQLController.executeStatementQuery(SQLController.java:552)
>     at org.datanucleus.store.rdbms.scostore.JoinListStore.listIterator(JoinListStore.java:770)
>     ... 37 more
> Caused by: java.sql.SQLException: Java exception: 'org.apache.derby.iapi.services.io.FormatableBitSet cannot be cast to org.apache.derby.iapi.store.access.StaticCompiledOpenConglomInfo: java.lang.ClassCastException'.
>     at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
>     at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
>     ... 51 more
> Caused by: java.lang.ClassCastException: org.apache.derby.iapi.services.io.FormatableBitSet cannot be cast to org.apache.derby.iapi.store.access.StaticCompiledOpenConglomInfo
>     at org.apache.derby.impl.sql.execute.GenericResultSetFactory.getBulkTableScanResultSet(Unknown Source)
>     at org.apache.derby.exe.ac1d3a42f2x015bx3579xc145x000022a319301bc.createResultSet(Unknown Source)
>     at org.apache.derby.impl.sql.execute.CursorActivation.decorateResultSet(Unknown Source)
>     at org.apache.derby.impl.sql.execute.BaseActivation.execute(Unknown Source)
>     at org.apache.derby.impl.sql.GenericActivationHolder.execute(Unknown Source)
>     at org.apache.derby.impl.sql.GenericPreparedStatement.executeStmt(Unknown Source)
>     at org.apache.derby.impl.sql.GenericPreparedStatement.execute(Unknown Source)
>     ... 44 more
> 2017-04-03T13:23:19,768 ERROR [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask. Exception when loading 30 in table web_sales with loadPath=file:/home/hiveptest/35.184.244.143-hiveptest-0/apache-github-source-source/itests/qtest/target/localfs/warehouse/web_sales/.hive-staging_hive_2017-04-03_13-23-18_143_3140419982719291367-1/-ext-10000
> 2017-04-03T13:23:19,768 DEBUG [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] ql.Driver: Shutting down query
> insert overwrite table web_sales
> partition (ws_web_site_sk)
> select ws_sold_date_sk, ws_sold_time_sk, ws_ship_date_sk, ws_item_sk,
>        ws_bill_customer_sk, ws_bill_cdemo_sk, ws_bill_hdemo_sk, ws_bill_addr_sk,
>        ws_ship_customer_sk, ws_ship_cdemo_sk, ws_ship_hdemo_sk, ws_ship_addr_sk,
>        ws_web_page_sk, ws_ship_mode_sk, ws_warehouse_sk, ws_promo_sk, ws_order_number,
>        ws_quantity, ws_wholesale_cost, ws_list_price, ws_sales_price, ws_ext_discount_amt,
>        ws_ext_sales_price, ws_ext_wholesale_cost, ws_ext_list_price, ws_ext_tax,
>        ws_coupon_amt, ws_ext_ship_cost, ws_net_paid, ws_net_paid_inc_tax, ws_net_paid_inc_ship,
>        ws_net_paid_inc_ship_tax, ws_net_profit, ws_web_site_sk from web_sales_txt
> 2017-04-03T13:23:19,769 DEBUG [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] log.PerfLogger: </PERFLOG method=Driver.execute start=1491250998223 end=1491250999769 duration=1546 from=org.apache.hadoop.hive.ql.Driver>
> 2017-04-03T13:23:19,769 INFO [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] metadata.Hive: Dumping metastore api call timing information for : execution phase
> 2017-04-03T13:23:19,769 DEBUG [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] metadata.Hive: Total time spent in each metastore function (ms): {add_partition_(Partition, )=514, getTable_(String, String, )=79}
> 2017-04-03T13:23:19,769 INFO [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] ql.Driver: Completed executing command(queryId=hiveptest_20170403132318_e3e4b6e2-2020-48df-9b24-6e96ad3b90db); Time taken: 1.546 seconds
> 2017-04-03T13:23:19,769 DEBUG [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
> 2017-04-03T13:23:19,769 DEBUG [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] log.PerfLogger: </PERFLOG method=releaseLocks start=1491250999769 end=1491250999769 duration=0 from=org.apache.hadoop.hive.ql.Driver>
> 2017-04-03T13:23:19,769 DEBUG [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] ql.Driver: Shutting down query
> insert overwrite table web_sales
> partition (ws_web_site_sk)
> select ws_sold_date_sk, ws_sold_time_sk, ws_ship_date_sk, ws_item_sk,
>        ws_bill_customer_sk, ws_bill_cdemo_sk, ws_bill_hdemo_sk, ws_bill_addr_sk,
>        ws_ship_customer_sk, ws_ship_cdemo_sk, ws_ship_hdemo_sk, ws_ship_addr_sk,
>        ws_web_page_sk, ws_ship_mode_sk, ws_warehouse_sk, ws_promo_sk, ws_order_number,
>        ws_quantity, ws_wholesale_cost, ws_list_price, ws_sales_price, ws_ext_discount_amt,
>        ws_ext_sales_price, ws_ext_wholesale_cost, ws_ext_list_price, ws_ext_tax,
>        ws_coupon_amt, ws_ext_ship_cost, ws_net_paid, ws_net_paid_inc_tax, ws_net_paid_inc_ship,
>        ws_net_paid_inc_ship_tax, ws_net_profit, ws_web_site_sk from web_sales_txt
> 2017-04-03T13:23:19,776 DEBUG [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] ql.Context: Deleting scratch dir: file:/home/hiveptest/35.184.244.143-hiveptest-0/apache-github-source-source/itests/qtest/target/localfs/warehouse/web_sales/.hive-staging_hive_2017-04-03_13-23-18_143_3140419982719291367-1
> 2017-04-03T13:23:19,776 DEBUG [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] ql.Context: Deleting scratch dir: file:/home/hiveptest/35.184.244.143-hiveptest-0/apache-github-source-source/itests/qtest/target/tmp/localscratchdir/c11b6d76-6e01-4371-88fd-b89c9e420bb4/hive_2017-04-03_13-23-18_143_3140419982719291367-1
> 2017-04-03T13:23:19,777 INFO [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] conf.HiveConf: Using the default value passed in for log id: c11b6d76-6e01-4371-88fd-b89c9e420bb4
> 2017-04-03T13:23:19,777 INFO [c11b6d76-6e01-4371-88fd-b89c9e420bb4 main] session.SessionState: Resetting thread name to main
> 2017-04-03T13:23:19,777 ERROR [main] QTestUtil: Client execution failed with error code = 1 running "
> insert overwrite table web_sales
> partition (ws_web_site_sk)
> select ws_sold_date_sk, ws_sold_time_sk, ws_ship_date_sk, ws_item_sk,
>        ws_bill_customer_sk, ws_bill_cdemo_sk, ws_bill_hdemo_sk, ws_bill_addr_sk,
>        ws_ship_customer_sk, ws_ship_cdemo_sk, ws_ship_hdemo_sk, ws_ship_addr_sk,
>        ws_web_page_sk, ws_ship_mode_sk, ws_warehouse_sk, ws_promo_sk, ws_order_number,
>        ws_quantity, ws_wholesale_cost, ws_list_price, ws_sales_price, ws_ext_discount_amt,
>        ws_ext_sales_price, ws_ext_wholesale_cost, ws_ext_list_price, ws_ext_tax,
>        ws_coupon_amt, ws_ext_ship_cost, ws_net_paid, ws_net_paid_inc_tax, ws_net_paid_inc_ship,
>        ws_net_paid_inc_ship_tax, ws_net_profit, ws_web_site_sk from web_sales_txt" fname=vector_count_distinct.q
> See ./ql/target/tmp/log/hive.log or ./itests/qtest/target/tmp/log/hive.log, or check ./ql/target/surefire-reports or ./itests/qtest/target/surefire-reports/ for specific test cases logs.
> 2017-04-03T13:23:19,778 INFO [main] control.CoreCliDriver: Done queryvector_count_distinct.q. succeeded=false, skipped=false. ElapsedTime(ms)=2187
> {code}

-- This message was sent by Atlassian Jira (v8.3.4#803005)