[ 
https://issues.apache.org/jira/browse/HIVE-7436?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14069825#comment-14069825
 ] 

Chengxiang Li commented on HIVE-7436:
-------------------------------------

[~xuefuz] The configurations you listed are Hive configurations, which can only 
be set in hive-site.xml; 
[here|http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.2/bk_installing_manually_book/content/rpm-chap-tez-configure_hive_for_tez.html]
 is the list for the whole Hive-Tez configuration set. Besides these, there are 
many Tez-specific configurations required to set up the Tez environment, such as 
'tez.lib.uris'; 
[here|http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.2/bk_installing_manually_book/content/rpm-chap-tez_configure_tez.html]
 is the full set of Tez configurations. These Tez-specific configurations can be 
set in either tez-site.xml or hive-site.xml under the same property name, and 
are loaded on the Tez side either from tez-site.xml or from the Configuration 
passed along by the Hive driver. The Hive-Tez configurations are distinct from 
the Tez-specific configurations and won't override any of them.
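As a sketch of that point, a Tez-environment property such as 'tez.lib.uris' can appear under the same property name in either file; the HDFS path in the value is hypothetical:

```xml
<!-- tez-site.xml (or, equivalently, hive-site.xml) -->
<property>
  <name>tez.lib.uris</name>
  <!-- hypothetical path; point this at the Tez tarball on your cluster -->
  <value>hdfs:///apps/tez/tez.tar.gz</value>
</property>
```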

> Load Spark configuration into Hive driver
> -----------------------------------------
>
>                 Key: HIVE-7436
>                 URL: https://issues.apache.org/jira/browse/HIVE-7436
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Chengxiang Li
>         Attachments: HIVE-7436-Spark.1.patch, HIVE-7436-Spark.2.patch
>
>
> Load Spark configuration into the Hive driver. There are 3 ways to set up 
> Spark configurations:
> #  Properties in the Spark configuration file (spark-defaults.conf).
> #  Java system properties.
> #  System environment variables.
> Spark supports configuration through environment variables only for 
> compatibility with previous scripts; we won't support that in Hive on Spark. 
> Hive on Spark loads defaults from Java properties, then loads properties from 
> the configuration file, overriding any existing properties.
> Configuration steps:
> # Create spark-defaults.conf and place it in the /etc/spark/conf 
> configuration directory.
>     Please refer to [http://spark.apache.org/docs/latest/configuration.html] 
> for the contents of spark-defaults.conf.
> # Set the $SPARK_CONF_DIR environment variable to the location of 
> spark-defaults.conf.
>     export SPARK_CONF_DIR=/etc/spark/conf
> # Add $SPARK_CONF_DIR to the $HADOOP_CLASSPATH environment variable.
>     export HADOOP_CLASSPATH=$SPARK_CONF_DIR:$HADOOP_CLASSPATH
> NO PRECOMMIT TESTS. This is for spark-branch only.
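The configuration steps in the description can be sketched as the shell session below. A scratch directory is used here for illustration; the description places the file under /etc/spark/conf, and the two property names are examples taken from the Spark configuration docs:

```shell
# Sketch of the three steps above, assuming a writable scratch directory
# in place of /etc/spark/conf.
SPARK_CONF_DIR=${TMPDIR:-/tmp}/spark-conf
mkdir -p "$SPARK_CONF_DIR"

# 1. Create spark-defaults.conf (property names per the Spark configuration page).
cat > "$SPARK_CONF_DIR/spark-defaults.conf" <<'EOF'
spark.master            yarn-cluster
spark.executor.memory   2g
EOF

# 2. Export SPARK_CONF_DIR so it points at the directory holding the file.
export SPARK_CONF_DIR

# 3. Prepend it to HADOOP_CLASSPATH so the Hive driver can load it.
export HADOOP_CLASSPATH=$SPARK_CONF_DIR:$HADOOP_CLASSPATH
```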



--
This message was sent by Atlassian JIRA
(v6.2#6252)
