[ https://issues.apache.org/jira/browse/HIVE-15773?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16996958#comment-16996958 ]

bianqi edited comment on HIVE-15773 at 12/16/19 4:28 AM:
---------------------------------------------------------

[~dongjoon] When we run the spark-sql program on a single CPU there is no 
exception, but when we run it on multiple CPUs it fails with an error.
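
To illustrate why the problem only surfaces with parallelism, here is a minimal, self-contained sketch (not Hive code; the names are made up for the demo) in which several threads populate a shared unsynchronized {{HashMap}} cache. A single thread cannot race with itself, but with multiple cores the map can lose entries or fail:

{code:java}
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class HashMapRaceDemo {
  // Shared, unsynchronized cache, analogous to the static HashMap caches
  // described in this issue.
  private static final Map<Integer, String> CACHE = new HashMap<>();

  public static void main(String[] args) throws InterruptedException {
    ExecutorService pool = Executors.newFixedThreadPool(8);
    for (int t = 0; t < 8; t++) {
      pool.submit(() -> {
        for (int i = 0; i < 10_000; i++) {
          // check-then-put is not atomic, and concurrent resizes can
          // corrupt the table, so failures are non-deterministic.
          if (!CACHE.containsKey(i)) {
            CACHE.put(i, "value-" + i);
          }
        }
      });
    }
    pool.shutdown();
    pool.awaitTermination(1, TimeUnit.MINUTES);
    // On a multi-core machine this can print fewer than 10000 entries
    // (or the run can hang or fail); a single thread always sees 10000.
    System.out.println("cache size = " + CACHE.size());
  }
}
{code}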


> HCatRecordObjectInspectorFactory is not thread safe
> ---------------------------------------------------
>
>                 Key: HIVE-15773
>                 URL: https://issues.apache.org/jira/browse/HIVE-15773
>             Project: Hive
>          Issue Type: Bug
>            Reporter: David Phillips
>            Priority: Major
>         Attachments: HIVE-15773.2.patch, HIVE-15773.3.patch, HIVE-15773.patch
>
>
> {{HashMap}} is used without synchronization for the caches, which makes the 
> code unsafe for use in a multi-threaded environment such as Presto (or Spark?). 
> The simple fix is to switch them to {{ConcurrentHashMap}}.
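
For reference, a minimal sketch of the kind of change the description suggests, with the unsynchronized cache replaced by {{ConcurrentHashMap}} (class, field, and method names are simplified stand-ins, and {{computeIfAbsent}} is used here for brevity; the actual patch may simply swap the map type):

{code:java}
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Simplified stand-in for the factory's static inspector cache (the real
// code is keyed by TypeInfo and stores ObjectInspector instances).
public class InspectorCacheSketch {

  // ConcurrentHashMap tolerates concurrent readers and writers, unlike
  // the plain HashMap it replaces.
  private static final Map<String, Object> CACHE = new ConcurrentHashMap<>();

  public static Object getInspector(String typeName) {
    // computeIfAbsent makes the check-then-create-then-insert sequence
    // atomic per key, so racing threads share one cached instance.
    return CACHE.computeIfAbsent(typeName, InspectorCacheSketch::create);
  }

  private static Object create(String typeName) {
    // Placeholder for the real ObjectInspector construction.
    return new Object();
  }
}
{code}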



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
