[ https://issues.apache.org/jira/browse/HIVE-16395?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16203100#comment-16203100 ]
Hive QA commented on HIVE-16395:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12891792/HIVE-16395.2.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 7 failed/errored test(s), 11223 tests executed

*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[optimize_nullscan] (batchId=162)
org.apache.hadoop.hive.cli.TestSparkPerfCliDriver.testCliDriver[query16] (batchId=241)
org.apache.hadoop.hive.cli.TestSparkPerfCliDriver.testCliDriver[query94] (batchId=241)
org.apache.hadoop.hive.cli.TestTezPerfCliDriver.testCliDriver[query14] (batchId=239)
org.apache.hadoop.hive.cli.TestTezPerfCliDriver.testCliDriver[query16] (batchId=239)
org.apache.hadoop.hive.cli.TestTezPerfCliDriver.testCliDriver[query94] (batchId=239)
org.apache.hadoop.hive.cli.control.TestDanglingQOuts.checkDanglingQOut (batchId=202)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/7266/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/7266/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-7266/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 7 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12891792 - PreCommit-HIVE-Build


> ConcurrentModificationException on config object in HoS
> -------------------------------------------------------
>
>                 Key: HIVE-16395
>                 URL: https://issues.apache.org/jira/browse/HIVE-16395
>             Project: Hive
>          Issue Type: Task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Assignee: Andrew Sherman
>         Attachments: HIVE-16395.1.patch, HIVE-16395.2.patch
>
>
> This looks to be happening inside Spark executors; it appears to be a race condition when modifying {{Configuration}} objects.
> Stack-Trace:
> {code}
> java.io.IOException: java.lang.reflect.InvocationTargetException
> 	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
> 	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:267)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:213)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:334)
> 	at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:682)
> 	at org.apache.spark.rdd.HadoopRDD$$anon$1.<init>(HadoopRDD.scala:240)
> 	at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:211)
> 	at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:101)
> 	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> 	at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> 	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
> 	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> 	at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> 	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:87)
> 	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
> 	at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
> 	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
> 	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:89)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:242)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.reflect.InvocationTargetException
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> 	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:253)
> 	... 21 more
> Caused by: java.util.ConcurrentModificationException
> 	at java.util.Hashtable$Enumerator.next(Hashtable.java:1167)
> 	at org.apache.hadoop.conf.Configuration.iterator(Configuration.java:2455)
> 	at org.apache.hadoop.fs.s3a.S3AUtils.propagateBucketOptions(S3AUtils.java:716)
> 	at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:181)
> 	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2815)
> 	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:98)
> 	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2852)
> 	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2834)
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:387)
> 	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
> 	at org.apache.hadoop.mapred.LineRecordReader.<init>(LineRecordReader.java:108)
> 	at org.apache.hadoop.mapred.TextInputFormat.getRecordReader(TextInputFormat.java:67)
> 	at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:68)
> 	... 26 more
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
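Editorial note (not part of the JIRA message or the attached patch): the innermost {{Caused by}} above shows one executor thread iterating a shared {{Configuration}} ({{S3AUtils.propagateBucketOptions}} calls {{Configuration.iterator()}}, which walks the {{Hashtable}}-backed properties) while another thread is still writing to the same object. The sketch below is a minimal, standalone illustration of that race; it assumes only Hadoop's {{Configuration}} class on the classpath, and the class name, key names, and loop counts are invented for the example.

{code:java}
import java.util.Map;

import org.apache.hadoop.conf.Configuration;

/**
 * Illustrative sketch only (not Hive code): reproduces the failure mode in the
 * stack trace above, where Configuration.iterator() walks the Hashtable-backed
 * properties while another thread is mutating the same Configuration.
 */
public class SharedConfRaceSketch {

  public static void main(String[] args) throws InterruptedException {
    final Configuration conf = new Configuration(false); // shared, mutable config

    // Seed enough entries that a full iteration takes a little while.
    for (int i = 0; i < 10_000; i++) {
      conf.set("seed.key." + i, "value" + i);
    }

    // Writer thread: keeps mutating the shared Configuration, the way a
    // concurrently initializing task might.
    Thread writer = new Thread(() -> {
      for (int i = 0; i < 1_000_000; i++) {
        conf.set("extra.key." + (i % 100), Integer.toString(i));
      }
    });
    writer.start();

    // Reader (this thread): iterates the same Configuration, as
    // S3AUtils.propagateBucketOptions does during S3AFileSystem.initialize().
    try {
      while (writer.isAlive()) {
        for (Map.Entry<String, String> entry : conf) {
          if (entry.getKey().isEmpty()) {
            throw new IllegalStateException("unreachable; keeps the loop live");
          }
        }
      }
      System.out.println("No race observed this run; try again.");
    } catch (java.util.ConcurrentModificationException cme) {
      // Same exception as the final 'Caused by' in the report above.
      cme.printStackTrace();
    } finally {
      writer.join();
    }
  }
}
{code}

A common way to avoid this class of failure is to give each task or record reader its own copy of the configuration (e.g. {{new JobConf(conf)}}) rather than sharing one mutable object across threads; whether HIVE-16395.2.patch takes exactly that approach is not shown in this message.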