[ https://issues.apache.org/jira/browse/HIVE-8649?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14201151#comment-14201151 ]
Xuefu Zhang commented on HIVE-8649:
-----------------------------------

+1

> Increase level of parallelism in reduce phase [Spark Branch]
> ------------------------------------------------------------
>
>                 Key: HIVE-8649
>                 URL: https://issues.apache.org/jira/browse/HIVE-8649
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Brock Noland
>            Assignee: Jimmy Xiang
>             Fix For: spark-branch
>
>         Attachments: HIVE-8649.1-spark.patch, HIVE-8649.2-spark.patch
>
>
> We currently calculate the number of reducers with the same code used for MapReduce. However, reduce tasks are vastly cheaper in Spark, and it is generally recommended to run many more of them than in MR.
> Sandy Ryza, who works on Spark, has some ideas about a heuristic.
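For illustration only, below is a minimal sketch of the kind of heuristic under discussion: keep the MR-style sizing (total input divided by a bytes-per-reducer target, capped by a maximum), then raise the result toward the available Spark cores since reduce tasks are cheap to launch. The method name, the core-count parameter, and the 2x-cores multiplier are assumptions for this sketch, not what HIVE-8649.2-spark.patch actually implements; only hive.exec.reducers.bytes.per.reducer and hive.exec.reducers.max are existing Hive settings referenced for context.

{code:java}
// Hypothetical sketch only -- the real heuristic in the attached patch may differ.
public final class ReducerEstimator {

  /**
   * Estimate the number of reducers from total input size, in the spirit of
   * Hive's MR-style sizing, then scale it up for Spark where reduce tasks
   * are cheaper. Parameter names are illustrative, not Hive config keys.
   */
  static int estimateReducers(long totalInputBytes,
                              long bytesPerReducer,  // cf. hive.exec.reducers.bytes.per.reducer
                              int maxReducers,       // cf. hive.exec.reducers.max
                              int clusterCores) {    // assumed to be known for the Spark cluster
    // MR-style estimate: one reducer per bytesPerReducer of input (rounded up).
    int byInput = (int) Math.min(maxReducers,
        (totalInputBytes + bytesPerReducer - 1) / bytesPerReducer);

    // Spark-oriented floor: avoid leaving cores idle; 2x cores is an assumed
    // multiplier chosen for the example, not taken from the patch.
    int byCores = clusterCores * 2;

    return Math.max(1, Math.min(maxReducers, Math.max(byInput, byCores)));
  }

  public static void main(String[] args) {
    // 10 GB of input, 256 MB per reducer, cap of 1009 reducers, 32 cores.
    // The input-based estimate is 40, but the core-based floor lifts it to 64.
    System.out.println(estimateReducers(10L << 30, 256L << 20, 1009, 32)); // prints 64
  }
}
{code}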