Hi all,
I'm running trunk (204a2055b1b9270ae13ea03b7aeac62b65166efd), and when I run
'hdfs datanode' it doesn't appear to respect HADOOP_CONF_DIR when locating
hdfs-site.xml and log4j.properties. This is particularly strange, since
'hdfs namenode' happily picks up the appropriate files from HADOOP_CONF_DIR.
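
For reference, here's roughly how I'm invoking both daemons -- a minimal repro
sketch, where /opt/hadoop and /etc/hadoop/conf are placeholder paths standing
in for my actual build and config locations:

```shell
# Placeholder paths -- substitute your own build and conf directories.
export HADOOP_HOME=/opt/hadoop
export HADOOP_CONF_DIR=/etc/hadoop/conf  # contains hdfs-site.xml and log4j.properties

# The namenode picks up hdfs-site.xml and log4j.properties from HADOOP_CONF_DIR...
"$HADOOP_HOME/bin/hdfs" namenode

# ...but the datanode, invoked identically, does not appear to:
"$HADOOP_HOME/bin/hdfs" datanode
```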

I can work around it with 'hdfs datanode --conf /path/to/hdfs-site.xml', but
that won't pick up the log4j.properties file.

Has the name of the environment variable changed for the datanode? Is it
supposed to look for some other configuration file (not hdfs-site.xml)? Or is
this a regression?

Thanks,
Ewan
Western Digital Corporation
