[ 
https://issues.apache.org/jira/browse/HIVE-9258?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jimmy Xiang reassigned HIVE-9258:
---------------------------------

    Assignee: Jimmy Xiang

> Explain query shouldn't launch a Spark application [Spark Branch]
> -----------------------------------------------------------------
>
>                 Key: HIVE-9258
>                 URL: https://issues.apache.org/jira/browse/HIVE-9258
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Xuefu Zhang
>            Assignee: Jimmy Xiang
>
> Currently for Hive on Spark, the query plan includes the number of reducers, 
> which is determined partly by the Spark cluster. Thus, an explain query 
> needs to launch a Spark application (Spark remote context), which is costly. 
> To make things worse, the application is discarded right away.
> Ideally, we shouldn't launch a Spark application even for an explain query.
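The fix the description suggests can be sketched as follows. This is a minimal illustration, not Hive's actual code: the class and method names (`ReducerEstimator`, `estimateReducers`) and the sizing heuristic are hypothetical, standing in for the real planner logic that today reads reducer parallelism from a live Spark remote context.

```java
// Hypothetical sketch: size reducers without contacting the cluster
// when the query is only being explained.
public final class ReducerEstimator {
    // Illustrative static fallback used for EXPLAIN-only planning.
    static final int DEFAULT_NUM_REDUCERS = 1;

    /**
     * Returns a reducer count. For an explain-only query, avoid
     * launching a Spark application just to read cluster capacity;
     * fall back to a static default instead.
     */
    public static int estimateReducers(boolean isExplain, int clusterCores) {
        if (isExplain) {
            // EXPLAIN: no Spark remote context is started.
            return DEFAULT_NUM_REDUCERS;
        }
        // Real execution: derive parallelism from cluster capacity
        // (placeholder heuristic for illustration).
        return Math.max(1, clusterCores / 2);
    }

    public static void main(String[] args) {
        System.out.println(estimateReducers(true, 16));  // explain-only path
        System.out.println(estimateReducers(false, 16)); // execution path
    }
}
```

The trade-off is that the explained plan may show a reducer count that differs from what a real run would use, which the issue implicitly accepts as the cost of a cheap EXPLAIN.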



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
