One correction: the better way is to just create a file called java-opts in
.../spark/conf with the following config value in it:

-Dhdp.version=<version of HDP>

One way to get the HDP version is to run the following one-liner on a node of
your HDP cluster:

hdp-select status hadoop-client | sed 's/hadoop-client - \(.*\)/\1/'
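Putting the two pieces together, a minimal sketch of automating the java-opts setup might look like the following. It assumes hdp-select is on the PATH of the node and that SPARK_HOME points at your Spark installation; both are assumptions about your environment, not part of the original instructions.

```shell
#!/bin/sh
# Capture the HDP version by stripping the "hadoop-client - " prefix
# from the hdp-select output, e.g. "hadoop-client - 2.2.5.0-2644".
HDP_VERSION=$(hdp-select status hadoop-client | sed 's/hadoop-client - \(.*\)/\1/')

# Write the JVM flag into conf/java-opts so Spark picks it up on launch.
# SPARK_HOME is assumed to be set to your Spark install directory.
echo "-Dhdp.version=${HDP_VERSION}" > "${SPARK_HOME}/conf/java-opts"
```

After running it, ${SPARK_HOME}/conf/java-opts should contain a single line such as -Dhdp.version=2.2.5.0-2644.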


You can also specify the same value using SPARK_JAVA_OPTS, i.e.

export SPARK_JAVA_OPTS="-Dhdp.version=2.2.5.0-2644"

or add the following options to spark-defaults.conf:

spark.driver.extraJavaOptions     -Dhdp.version=2.2.5.0-2644
spark.yarn.am.extraJavaOptions    -Dhdp.version=2.2.5.0-2644




--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-6-0-RC3-tp15660p15701.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
