[ https://issues.apache.org/jira/browse/FLINK-32731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17757780#comment-17757780 ]
Shengkai Fang commented on FLINK-32731:
---------------------------------------

Thanks for sharing. I think we should add a retry mechanism that restarts the container when the NameNode fails. I will open a PR to fix this soon.

> SqlGatewayE2ECase.testHiveServer2ExecuteStatement failed due to MetaException
> -----------------------------------------------------------------------------
>
>                 Key: FLINK-32731
>                 URL: https://issues.apache.org/jira/browse/FLINK-32731
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / Gateway
>    Affects Versions: 1.18.0
>            Reporter: Matthias Pohl
>            Assignee: Shengkai Fang
>            Priority: Major
>              Labels: pull-request-available, test-stability
>
> https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=51891&view=logs&j=fb37c667-81b7-5c22-dd91-846535e99a97&t=011e961e-597c-5c96-04fe-7941c8b83f23&l=10987
> {code}
> Aug 02 02:14:04 02:14:04.957 [ERROR] Tests run: 3, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 198.658 s <<< FAILURE! - in org.apache.flink.table.gateway.SqlGatewayE2ECase
> Aug 02 02:14:04 02:14:04.966 [ERROR] org.apache.flink.table.gateway.SqlGatewayE2ECase.testHiveServer2ExecuteStatement  Time elapsed: 31.437 s <<< ERROR!
> Aug 02 02:14:04 java.util.concurrent.ExecutionException:
> Aug 02 02:14:04 java.sql.SQLException: org.apache.flink.table.gateway.service.utils.SqlExecutionException: Failed to execute the operation d440e6e7-0fed-49c9-933e-c7be5bbae50d.
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationManager$Operation.processThrowable(OperationManager.java:414)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationManager$Operation.lambda$run$0(OperationManager.java:267)
> Aug 02 02:14:04     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> Aug 02 02:14:04     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> Aug 02 02:14:04     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> Aug 02 02:14:04     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> Aug 02 02:14:04     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> Aug 02 02:14:04     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> Aug 02 02:14:04     at java.lang.Thread.run(Thread.java:750)
> Aug 02 02:14:04 Caused by: org.apache.flink.table.api.TableException: Could not execute CreateTable in path `hive`.`default`.`CsvTable`
> Aug 02 02:14:04     at org.apache.flink.table.catalog.CatalogManager.execute(CatalogManager.java:1289)
> Aug 02 02:14:04     at org.apache.flink.table.catalog.CatalogManager.createTable(CatalogManager.java:939)
> Aug 02 02:14:04     at org.apache.flink.table.operations.ddl.CreateTableOperation.execute(CreateTableOperation.java:84)
> Aug 02 02:14:04     at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1080)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationExecutor.callOperation(OperationExecutor.java:570)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationExecutor.executeOperation(OperationExecutor.java:458)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationExecutor.executeStatement(OperationExecutor.java:210)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.SqlGatewayServiceImpl.lambda$executeStatement$1(SqlGatewayServiceImpl.java:212)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationManager.lambda$submitOperation$1(OperationManager.java:119)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationManager$Operation.lambda$run$0(OperationManager.java:258)
> Aug 02 02:14:04     ... 7 more
> Aug 02 02:14:04 Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Failed to create table default.CsvTable
> Aug 02 02:14:04     at org.apache.flink.table.catalog.hive.HiveCatalog.createTable(HiveCatalog.java:547)
> Aug 02 02:14:04     at org.apache.flink.table.catalog.CatalogManager.lambda$createTable$16(CatalogManager.java:950)
> Aug 02 02:14:04     at org.apache.flink.table.catalog.CatalogManager.execute(CatalogManager.java:1283)
> Aug 02 02:14:04     ... 16 more
> Aug 02 02:14:04 Caused by: MetaException(message:Got exception: java.net.ConnectException Call From 70d5c7217fe8/172.17.0.2 to hadoop-master:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:42225)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:42193)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result.read(ThriftHiveMetastore.java:42119)
> Aug 02 02:14:04     at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_table_with_environment_context(ThriftHiveMetastore.java:1203)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1189)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2396)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:750)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:738)
> Aug 02 02:14:04     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> Aug 02 02:14:04     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> Aug 02 02:14:04     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> Aug 02 02:14:04     at java.lang.reflect.Method.invoke(Method.java:498)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:169)
> Aug 02 02:14:04     at com.sun.proxy.$Proxy26.createTable(Unknown Source)
> Aug 02 02:14:04     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> Aug 02 02:14:04     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> Aug 02 02:14:04     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> Aug 02 02:14:04     at java.lang.reflect.Method.invoke(Method.java:498)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2327)
> Aug 02 02:14:04     at com.sun.proxy.$Proxy26.createTable(Unknown Source)
> Aug 02 02:14:04     at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createTable(HiveMetastoreClientWrapper.java:174)
> Aug 02 02:14:04     at org.apache.flink.table.catalog.hive.HiveCatalog.createTable(HiveCatalog.java:539)
> Aug 02 02:14:04     ... 18 more
> Aug 02 02:14:04
> Aug 02 02:14:04     at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> Aug 02 02:14:04     at java.util.concurrent.FutureTask.get(FutureTask.java:206)
> Aug 02 02:14:04     at org.apache.flink.tests.util.flink.FlinkDistribution.submitSQL(FlinkDistribution.java:341)
> Aug 02 02:14:04     at org.apache.flink.tests.util.flink.FlinkDistribution.submitSQLJob(FlinkDistribution.java:281)
> Aug 02 02:14:04     at org.apache.flink.tests.util.flink.LocalStandaloneFlinkResource$GatewayClusterControllerImpl.submitSQLJob(LocalStandaloneFlinkResource.java:220)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.SqlGatewayE2ECase.executeStatement(SqlGatewayE2ECase.java:133)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.SqlGatewayE2ECase.testHiveServer2ExecuteStatement(SqlGatewayE2ECase.java:107)
> Aug 02 02:14:04     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> Aug 02 02:14:04     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> Aug 02 02:14:04     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> Aug 02 02:14:04     at java.lang.reflect.Method.invoke(Method.java:498)
> Aug 02 02:14:04     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
> Aug 02 02:14:04     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> Aug 02 02:14:04     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
> Aug 02 02:14:04     at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> Aug 02 02:14:04     at org.apache.flink.util.ExternalResource$1.evaluate(ExternalResource.java:48)
> Aug 02 02:14:04     at org.apache.flink.util.TestNameProvider$1.evaluate(TestNameProvider.java:45)
> Aug 02 02:14:04     at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:61)
> Aug 02 02:14:04     at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
> Aug 02 02:14:04     at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
> Aug 02 02:14:04     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
> Aug 02 02:14:04     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
> Aug 02 02:14:04     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
> Aug 02 02:14:04     at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
> Aug 02 02:14:04     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
> Aug 02 02:14:04     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
> Aug 02 02:14:04     at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
> Aug 02 02:14:04     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
> Aug 02 02:14:04     at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> Aug 02 02:14:04     at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
> Aug 02 02:14:04     at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:54)
> Aug 02 02:14:04     at org.testcontainers.containers.FailureDetectingExternalResource$1.evaluate(FailureDetectingExternalResource.java:29)
> Aug 02 02:14:04     at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> Aug 02 02:14:04     at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
> Aug 02 02:14:04     at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
> Aug 02 02:14:04     at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
> Aug 02 02:14:04     at org.junit.runner.JUnitCore.run(JUnitCore.java:115)
> Aug 02 02:14:04     at org.junit.vintage.engine.execution.RunnerExecutor.execute(RunnerExecutor.java:42)
> Aug 02 02:14:04     at org.junit.vintage.engine.VintageTestEngine.executeAllChildren(VintageTestEngine.java:80)
> Aug 02 02:14:04     at org.junit.vintage.engine.VintageTestEngine.execute(VintageTestEngine.java:72)
> Aug 02 02:14:04     at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:147)
> Aug 02 02:14:04     at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:127)
> Aug 02 02:14:04     at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:90)
> Aug 02 02:14:04     at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:55)
> Aug 02 02:14:04     at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:102)
> Aug 02 02:14:04     at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:54)
> Aug 02 02:14:04     at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:114)
> Aug 02 02:14:04     at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:86)
> Aug 02 02:14:04     at org.junit.platform.launcher.core.DefaultLauncherSession$DelegatingLauncher.execute(DefaultLauncherSession.java:86)
> Aug 02 02:14:04     at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:53)
> Aug 02 02:14:04     at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188)
> Aug 02 02:14:04     at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154)
> Aug 02 02:14:04     at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128)
> Aug 02 02:14:04     at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428)
> Aug 02 02:14:04     at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162)
> Aug 02 02:14:04     at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562)
> Aug 02 02:14:04     at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548)
> Aug 02 02:14:04 Caused by: java.sql.SQLException: org.apache.flink.table.gateway.service.utils.SqlExecutionException: Failed to execute the operation d440e6e7-0fed-49c9-933e-c7be5bbae50d.
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationManager$Operation.processThrowable(OperationManager.java:414)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationManager$Operation.lambda$run$0(OperationManager.java:267)
> Aug 02 02:14:04     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> Aug 02 02:14:04     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> Aug 02 02:14:04     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> Aug 02 02:14:04     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> Aug 02 02:14:04     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> Aug 02 02:14:04     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> Aug 02 02:14:04     at java.lang.Thread.run(Thread.java:750)
> Aug 02 02:14:04 Caused by: org.apache.flink.table.api.TableException: Could not execute CreateTable in path `hive`.`default`.`CsvTable`
> Aug 02 02:14:04     at org.apache.flink.table.catalog.CatalogManager.execute(CatalogManager.java:1289)
> Aug 02 02:14:04     at org.apache.flink.table.catalog.CatalogManager.createTable(CatalogManager.java:939)
> Aug 02 02:14:04     at org.apache.flink.table.operations.ddl.CreateTableOperation.execute(CreateTableOperation.java:84)
> Aug 02 02:14:04     at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1080)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationExecutor.callOperation(OperationExecutor.java:570)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationExecutor.executeOperation(OperationExecutor.java:458)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationExecutor.executeStatement(OperationExecutor.java:210)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.SqlGatewayServiceImpl.lambda$executeStatement$1(SqlGatewayServiceImpl.java:212)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationManager.lambda$submitOperation$1(OperationManager.java:119)
> Aug 02 02:14:04     at org.apache.flink.table.gateway.service.operation.OperationManager$Operation.lambda$run$0(OperationManager.java:258)
> Aug 02 02:14:04     ... 7 more
> Aug 02 02:14:04 Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Failed to create table default.CsvTable
> Aug 02 02:14:04     at org.apache.flink.table.catalog.hive.HiveCatalog.createTable(HiveCatalog.java:547)
> Aug 02 02:14:04     at org.apache.flink.table.catalog.CatalogManager.lambda$createTable$16(CatalogManager.java:950)
> Aug 02 02:14:04     at org.apache.flink.table.catalog.CatalogManager.execute(CatalogManager.java:1283)
> Aug 02 02:14:04     ... 16 more
> Aug 02 02:14:04 Caused by: MetaException(message:Got exception: java.net.ConnectException Call From 70d5c7217fe8/172.17.0.2 to hadoop-master:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:42225)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:42193)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result.read(ThriftHiveMetastore.java:42119)
> Aug 02 02:14:04     at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_table_with_environment_context(ThriftHiveMetastore.java:1203)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1189)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2396)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:750)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:738)
> Aug 02 02:14:04     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> Aug 02 02:14:04     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> Aug 02 02:14:04     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> Aug 02 02:14:04     at java.lang.reflect.Method.invoke(Method.java:498)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:169)
> Aug 02 02:14:04     at com.sun.proxy.$Proxy26.createTable(Unknown Source)
> Aug 02 02:14:04     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> Aug 02 02:14:04     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> Aug 02 02:14:04     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> Aug 02 02:14:04     at java.lang.reflect.Method.invoke(Method.java:498)
> Aug 02 02:14:04     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2327)
> Aug 02 02:14:04     at com.sun.proxy.$Proxy26.createTable(Unknown Source)
> Aug 02 02:14:04     at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createTable(HiveMetastoreClientWrapper.java:174)
> Aug 02 02:14:04     at org.apache.flink.table.catalog.hive.HiveCatalog.createTable(HiveCatalog.java:539)
> Aug 02 02:14:04     ... 18 more
> Aug 02 02:14:04
> Aug 02 02:14:04     at org.apache.hive.jdbc.HiveStatement.waitForOperationToComplete(HiveStatement.java:385)
> Aug 02 02:14:04     at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:254)
> Aug 02 02:14:04     at org.apache.flink.tests.util.flink.FlinkDistribution.lambda$submitSQLJob$6(FlinkDistribution.java:293)
> Aug 02 02:14:04     at org.apache.flink.util.function.FunctionUtils.lambda$asCallable$5(FunctionUtils.java:126)
> Aug 02 02:14:04     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> Aug 02 02:14:04     at java.lang.Thread.run(Thread.java:750)
> {code}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
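The retry mechanism suggested in the comment could be sketched roughly as below. This is a hypothetical helper, not existing Flink test code; the names `ContainerRetry`, `retry`, `maxAttempts`, and `backoffMillis` are illustrative. The idea is simply to re-run the container (re)start action a bounded number of times so that a transient "Connection refused" from the NameNode does not immediately fail the e2e run:

```java
import java.util.concurrent.Callable;

/**
 * Sketch of a bounded retry helper (illustrative only, not Flink API):
 * re-runs an action with a linearly growing backoff and rethrows the last
 * failure once all attempts are exhausted.
 */
public final class ContainerRetry {

    private ContainerRetry() {}

    public static <T> T retry(Callable<T> action, int maxAttempts, long backoffMillis)
            throws Exception {
        if (maxAttempts < 1) {
            throw new IllegalArgumentException("maxAttempts must be >= 1");
        }
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.call();
            } catch (Exception e) {
                // e.g. java.net.ConnectException to hadoop-master:9000
                last = e;
                if (attempt < maxAttempts) {
                    Thread.sleep(backoffMillis * attempt); // linear backoff
                }
            }
        }
        throw last;
    }
}
```

In the test, such a helper would wrap the container start (or the catalog call that hits the metastore), turning a flaky NameNode startup into a bounded wait instead of a hard failure.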