Hi,

I am running Spark 2.0.0 and 2.1.1 on YARN in a Hadoop 2.7.3 cluster. Is
spark-env.sh sourced when starting the Spark AM container or the executor
container?

I saw this paragraph in
https://github.com/apache/spark/blob/master/docs/configuration.md:

> Note: When running Spark on YARN in cluster mode, environment variables
> need to be set using the spark.yarn.appMasterEnv.[EnvironmentVariableName]
> property in your conf/spark-defaults.conf file. Environment variables that
> are set in spark-env.sh will not be reflected in the YARN Application
> Master process in cluster mode. See the YARN-related Spark Properties
> <https://github.com/apache/spark/blob/master/docs/running-on-yarn.html#spark-properties>
> for more information.
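
For reference, here is how I understand that property would be used in
conf/spark-defaults.conf (JAVA_HOME is just an example variable name, and
I assume spark.executorEnv.[EnvironmentVariableName] is the executor-side
analogue):

    # Set an environment variable for the YARN Application Master (cluster mode)
    spark.yarn.appMasterEnv.JAVA_HOME  /usr/lib/jvm/java-8-openjdk
    # Set the same variable for the executors
    spark.executorEnv.JAVA_HOME        /usr/lib/jvm/java-8-openjdk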


Does it mean spark-env.sh will not be sourced when starting the AM in
cluster mode?
Does this paragraph apply to the executors as well?

Thanks,
-- 
John Zhuge
