[ https://issues.apache.org/jira/browse/HIVE-7436?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chengxiang Li updated HIVE-7436:
--------------------------------

    Description: 
Load Spark configuration into the Hive driver. There are three ways to set 
up Spark configuration:
# Java system properties.
# The Spark configuration file (spark-defaults.conf).
# The Hive configuration file (hive-site.xml).

Sources lower in this list have higher priority and overwrite earlier 
configuration with the same property name; for example, a property set in 
hive-site.xml overrides the same property set in spark-defaults.conf.

Please refer to [http://spark.apache.org/docs/latest/configuration.html] for 
all configurable Spark properties. You can configure Spark in Hive in the 
following ways:
# Configure through the Spark configuration file (see the sample 
spark-defaults.conf after this list).
#* Create spark-defaults.conf and place it in the /etc/spark/conf 
configuration directory. Configure properties in spark-defaults.conf in Java 
properties format.
#* Create the $SPARK_CONF_DIR environment variable and set it to the location 
of spark-defaults.conf.
    export SPARK_CONF_DIR=/etc/spark/conf
#* Add $SPARK_CONF_DIR to the $HADOOP_CLASSPATH environment variable.
    export HADOOP_CLASSPATH=$SPARK_CONF_DIR:$HADOOP_CLASSPATH
# Configure through the Hive configuration file (see the sample 
hive-site.xml after this list).
#* Edit hive-site.xml in the Hive conf directory and configure the same 
Spark properties in XML format.
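
For illustration, a minimal spark-defaults.conf might look like this, in 
Java properties format (the master URL and memory setting are placeholder 
values, not defaults shipped with this patch):
    spark.master             spark://master-host:7077
    spark.app.name           Hive on Spark
    spark.executor.memory    1g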
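
And a sketch of the equivalent hive-site.xml entries (the property names are 
standard Spark properties; the values are again illustrative):
    <property>
      <name>spark.master</name>
      <value>spark://master-host:7077</value>
    </property>
    <property>
      <name>spark.executor.memory</name>
      <value>1g</value>
    </property>
Since hive-site.xml has the highest priority, these values would override 
the same properties set in spark-defaults.conf.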

Hive driver default Spark properties:
||name||default value||description||
|spark.master|local|Spark master URL.|
|spark.app.name|Hive on Spark|Default Spark application name.|
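
Because Java system properties sit at the bottom of the priority order, they 
can serve as fallback defaults. A hedged sketch, assuming the Hive driver 
JVM picks up JVM options passed through $HADOOP_OPTS (local[4] is an 
illustrative master URL):
    export HADOOP_OPTS="-Dspark.master=local[4] $HADOOP_OPTS"
A value set in spark-defaults.conf or hive-site.xml would still override it.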

NO PRECOMMIT TESTS. This is for spark-branch only.

  was:
Load Spark configuration into the Hive driver. There are three ways to set 
up Spark configuration:
# The Spark configuration file (spark-defaults.conf).
# Java system properties.
# System environment variables.
Spark supports configuration through environment variables only for 
compatibility with legacy scripts; we won't support it in Hive on Spark. 
Hive on Spark loads defaults from Java properties, then loads properties 
from the configuration file, overriding existing properties.

Configuration steps:
# Create spark-defaults.conf, and place it in the /etc/spark/conf configuration 
directory.
    Please refer to [http://spark.apache.org/docs/latest/configuration.html] 
for the configuration of spark-defaults.conf.
# Create the $SPARK_CONF_DIR environment variable and set it to the location of 
spark-defaults.conf.
    export SPARK_CONF_DIR=/etc/spark/conf
# Add $SPARK_CONF_DIR to the $HADOOP_CLASSPATH environment variable.
    export HADOOP_CLASSPATH=$SPARK_CONF_DIR:$HADOOP_CLASSPATH

NO PRECOMMIT TESTS. This is for spark-branch only.


> Load Spark configuration into Hive driver
> -----------------------------------------
>
>                 Key: HIVE-7436
>                 URL: https://issues.apache.org/jira/browse/HIVE-7436
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Chengxiang Li
>         Attachments: HIVE-7436-Spark.1.patch, HIVE-7436-Spark.2.patch
>



--
This message was sent by Atlassian JIRA
(v6.2#6252)
