[ https://issues.apache.org/jira/browse/HIVE-10009?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jimmy Xiang updated HIVE-10009:
-------------------------------
    Fix Version/s: spark-branch

> LazyObjectInspectorFactory is not thread safe [Spark Branch]
> ------------------------------------------------------------
>
>                 Key: HIVE-10009
>                 URL: https://issues.apache.org/jira/browse/HIVE-10009
>             Project: Hive
>          Issue Type: Bug
>    Affects Versions: spark-branch
>            Reporter: Jimmy Xiang
>            Assignee: Jimmy Xiang
>             Fix For: spark-branch
>
>
> LazyObjectInspectorFactory is not thread safe, which causes random failures
> in multi-threaded environments such as Hive on Spark. We see exceptions like
> the one below:
> {noformat}
> java.lang.RuntimeException: Map operator initialization failed: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.SettableStructObjectInspector
>       at org.apache.hadoop.hive.ql.exec.spark.SparkMapRecordHandler.init(SparkMapRecordHandler.java:127)
>       at org.apache.hadoop.hive.ql.exec.spark.HiveMapFunction.call(HiveMapFunction.java:55)
>       at org.apache.hadoop.hive.ql.exec.spark.HiveMapFunction.call(HiveMapFunction.java:30)
>       at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:170)
>       at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:170)
>       at org.apache.spark.rdd.RDD$$anonfun$14.apply(RDD.scala:634)
>       at org.apache.spark.rdd.RDD$$anonfun$14.apply(RDD.scala:634)
>       at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
>       at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
>       at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
>       at org.apache.spark.scheduler.Task.run(Task.scala:64)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.SettableStructObjectInspector
>       at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters.getConverter(ObjectInspectorConverters.java:154)
>       at org.apache.hadoop.hive.ql.exec.MapOperator.initObjectInspector(MapOperator.java:199)
>       at org.apache.hadoop.hive.ql.exec.MapOperator.setChildren(MapOperator.java:355)
>       at org.apache.hadoop.hive.ql.exec.spark.SparkMapRecordHandler.init(SparkMapRecordHandler.java:92)
>       ... 16 more
> {noformat}
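
For context, here is a minimal Java sketch of the kind of race that produces this symptom: a static factory caching instances in a plain HashMap, which corrupts or mis-publishes entries when several executor threads hit it at once, and a lock-free alternative using ConcurrentHashMap. This is an illustration under assumed names (InspectorFactorySketch, getUnsafe, getSafe), not the actual LazyObjectInspectorFactory code or the exact patch for this ticket.

{noformat}
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public final class InspectorFactorySketch {

  // Not thread safe: concurrent callers can interleave get/put,
  // resize the map mid-read, or observe a partially-published entry.
  private static final Map<String, Object> UNSAFE_CACHE = new HashMap<>();

  public static Object getUnsafe(String key) {
    Object inspector = UNSAFE_CACHE.get(key);
    if (inspector == null) {
      inspector = new Object();          // stands in for building an ObjectInspector
      UNSAFE_CACHE.put(key, inspector);  // racy check-then-put
    }
    return inspector;
  }

  // Thread safe: ConcurrentHashMap with an atomic check-then-insert,
  // so every caller sees exactly one fully constructed instance.
  private static final ConcurrentHashMap<String, Object> SAFE_CACHE =
      new ConcurrentHashMap<>();

  public static Object getSafe(String key) {
    return SAFE_CACHE.computeIfAbsent(key, k -> new Object());
  }
}
{noformat}

When the unsafe variant is called concurrently during map-task initialization (as Spark executors do from a thread pool), a caller can end up with the wrong cached inspector, which only surfaces later as the ClassCastException shown in the stack trace above.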



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
