[
https://issues.apache.org/jira/browse/HADOOP-19001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17797918#comment-17797918
]
ASF GitHub Bot commented on HADOOP-19001:
-----------------------------------------
iwasakims commented on code in PR #6367:
URL: https://github.com/apache/hadoop/pull/6367#discussion_r1429147629
##########
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/resources/yarn-default.xml:
##########
@@ -1309,7 +1309,7 @@
<property>
<description>Environment variables that containers may override rather
than use NodeManager's default.</description>
<name>yarn.nodemanager.env-whitelist</name>
-    <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_HOME,PATH,LANG,TZ</value>
+    <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_HOME,PATH,LANG,TZ,LD_LIBRARY_PATH</value>
Review Comment:
I'm -1 on adding LD_LIBRARY_PATH to the whitelist by default, since [its use
is generally
discouraged](https://www.hpc.dtu.dk/?page_id=1180#:~:text=Inconsistency%3A%20This%20is%20the%20most,compatible%20with%20the%20original%20version.).
Users should opt in by changing `yarn.nodemanager.env-whitelist` themselves.
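
For anyone who does want to opt in, a sketch of the override in yarn-site.xml
(the property name matches yarn-default.xml above; the value is the current
default list with LD_LIBRARY_PATH appended):

```xml
<!-- yarn-site.xml: site-level override of the container env whitelist.
     Appending LD_LIBRARY_PATH lets containers inherit it from the NM. -->
<property>
  <name>yarn.nodemanager.env-whitelist</name>
  <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_HOME,PATH,LANG,TZ,LD_LIBRARY_PATH</value>
</property>
```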
> LD_LIBRARY_PATH is missing HADOOP_COMMON_LIB_NATIVE_DIR
> -------------------------------------------------------
>
> Key: HADOOP-19001
> URL: https://issues.apache.org/jira/browse/HADOOP-19001
> Project: Hadoop Common
> Issue Type: Bug
> Affects Versions: 3.2.4
> Reporter: Zilong Zhu
> Priority: Major
> Labels: pull-request-available
>
> When we ran a Spark job, we found that it could not load the native library.
> We traced this to a difference between hadoop2 and hadoop3.
> hadoop2-Spark-System Properties:
> |java.library.path|:/hadoop/lib/native:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib|
> hadoop3-Spark-System Properties:
> |java.library.path|:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib|
> The key point is:
> hadoop2-hadoop-config.sh:
> HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$JAVA_LIBRARY_PATH" <--267
> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JAVA_LIBRARY_PATH <--268
>
> hadoop3-hadoop-functions.sh:
> hadoop_add_param HADOOP_OPTS java.library.path \
> "-Djava.library.path=${JAVA_LIBRARY_PATH}"
> export LD_LIBRARY_PATH <--1484
>
> At the same time, hadoop3 clears all non-whitelisted environment variables.
> I'm not sure if this was intentional, but it makes our Spark job unable to
> find the native library on hadoop3.
> Maybe we should modify hadoop-functions.sh (line 1484) and add
> LD_LIBRARY_PATH to the default value of yarn.nodemanager.env-whitelist.
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]