I have checked: the classpath contains my conf folder, and $HADOOP_CONF_DIR points to it.
Regards
Saptarshi

On Jan 6, 2009, at 12:53 AM, Sharad Agarwal wrote:

The bin/hadoop script does that: it puts $HADOOP_CONF_DIR on the classpath. Try echoing $HADOOP_CONF_DIR and $CLASSPATH in the bin/hadoop script and see whether they are getting picked up correctly.
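The same check can be done from the Java side. This is a minimal sketch (the class name `CheckClasspath` is made up for illustration); it only shows what the running JVM sees, which should match what bin/hadoop computed if the job was launched through that script:

```java
// Minimal diagnostic: print HADOOP_CONF_DIR and every classpath entry
// the JVM sees, so you can confirm the conf directory is actually on it.
import java.io.File;

public class CheckClasspath {
    public static void main(String[] args) {
        // May print "null" if the variable was not exported to this process.
        System.out.println("HADOOP_CONF_DIR = " + System.getenv("HADOOP_CONF_DIR"));
        String cp = System.getProperty("java.class.path");
        for (String entry : cp.split(File.pathSeparator)) {
            System.out.println("classpath entry: " + entry);
        }
    }
}
```

If the conf directory does not appear in the printed entries, the config files cannot be found no matter what they contain.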

Saptarshi Guha wrote:

Possibly. When I force-load the configuration
        Configuration cf = new Configuration();
        LOG.info("Adding new resource");
        cf.addResource("hadoop-site.xml");

it doesn't load, though hadoop-site.xml is present in
hadoop-0.19.0/conf, and the hadoop-0.19.0 folder is replicated across
all the machines. However, if I load it via
        cf.addResource(new Path("/home/godhuli/custom/hadoop-0.19.0/conf/hadoop-site.xml"));
It works! So clearly, there is some other hadoop-site.xml in the
classpath, but where? Should I add
/home/godhuli/custom/hadoop-0.19.0/conf/ to the Hadoop classpath in
hadoop-env.sh?
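One way to answer "but where?" directly is to ask the classloader for every copy of hadoop-site.xml it can see. As I understand it, Configuration resolves a String resource name through the classloader, so the first URL printed below should be the copy that addResource("hadoop-site.xml") picks up. A minimal sketch (the class name `WhichSiteXml` is made up; run it with the same classpath your job gets):

```java
// Sketch: enumerate every hadoop-site.xml visible on the classpath.
// Multiple hits mean a stray copy (e.g. bundled in a jar) is shadowing
// the one in your conf directory.
import java.net.URL;
import java.util.Enumeration;

public class WhichSiteXml {
    public static void main(String[] args) throws Exception {
        Enumeration<URL> copies = WhichSiteXml.class.getClassLoader()
                .getResources("hadoop-site.xml");
        if (!copies.hasMoreElements()) {
            System.out.println("no hadoop-site.xml on the classpath at all");
        }
        while (copies.hasMoreElements()) {
            System.out.println("found: " + copies.nextElement());
        }
    }
}
```

If this prints a jar URL before (or instead of) your conf directory, that jar contains the copy being loaded.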

Thanks
Saptarshi

On Mon, Jan 5, 2009 at 5:09 PM, Jason Venner <[email protected]> wrote:
> Somehow you have alternate versions of the file earlier in the classpath.
>
> Perhaps someone's empty copies are bundled into one of your application jar
> files.
>
> Or perhaps the configuration files are not distributed to the datanodes in
> the expected locations.
>
> Saptarshi Guha wrote:
>>
>> For some strange reason, neither hadoop-default.xml nor
>> hadoop-site.xml is loading. Both files are in hadoop-0.19.0/conf/
>> folder.
>> HADOOP_CONF_DIR points to this folder.
>> In some Java code (not a job), I did
>>        Configuration cf = new Configuration();
>>        String[] xu = new String[] {"io.file.buffer.size", "io.sort.mb",
>>            "io.sort.factor", "mapred.tasktracker.map.tasks.maximum",
>>            "hadoop.logfile.count", "hadoop.logfile.size", "hadoop.tmp.dir",
>>            "dfs.replication", "mapred.map.tasks", "mapred.job.tracker",
>>            "fs.default.name"};
>>        for (String x : xu) {
>>            LOG.info(x + ": " + cf.get(x));
>>        }
>>
>>
>> All the values returned were default values (as noted in
>> hadoop-default.xml). Changes to the values in hadoop-default.xml were not
>> reflected here. Grepping the source reveals that the defaults are
>> hardcoded. So it seems neither hadoop-default.xml nor
>> hadoop-site.xml is loading.
>> Also, Configuration.java mentions that hadoop-site.xml is deprecated
>> (though still loaded).
>>
>> Any suggestions?
>> Regards
>> Saptarshi
>>
>>
>> On Mon, Jan 5, 2009 at 2:26 PM, Saptarshi Guha <[email protected]>
>> wrote:
>>
>>>
>>> Hello,
>>> I have set my HADOOP_CONF_DIR to the conf folder and still not
>>> loading. I have to manually set the options when I create my conf.
>>> Have you resolved this?
>>>
>>> Regards
>>> Saptarshi
>>>
>>> On Tue, Dec 30, 2008 at 5:25 PM, g00dn3ss <[email protected]> wrote:
>>>
>>>>
>>>> Hey all,
>>>>
>>>> I have a similar issue. I am specifically having problems with the
>>>> config
>>>> option "mapred.child.java.opts." I set it to -Xmx1024m and it uses
>>>> -Xmx200m
>>>> regardless. I am running Hadoop 0.18.2 and I'm pretty sure this option
>>>> was
>>>> working in the previous versions of Hadoop I was using.
>>>>
>>>> I am not explicitly setting HADOOP_CONF_DIR. My site config is in
>>>> ${HADOOP_HOME}/conf. Just to test things further, I wrote a small map
>>>> task
>>>> to print out the ENV values and it has the correct value for
>>>> HADOOP_HOME,
>>>> HADOOP_LOG_DIR, HADOOP_OPTS, etc... I also printed out the key/values
>>>> in
>>>> the JobConf passed to the mapper and it has my specified values for
>>>> fs.default.name and mapred.job.tracker.  Other settings like
>>>> dfs.name.dir,
>>>> dfs.data.dir, and mapred.child.java.opts do not have my values.
>>>>
>>>> Any suggestion where to look at next?
>>>>
>>>> Thanks!
>>>>
>>>>
>>>>
>>>> On Mon, Dec 29, 2008 at 10:27 PM, Amareshwari Sriramadasu <
>>>> [email protected]> wrote:
>>>>
>>>>
>>>>>
>>>>> Saptarshi Guha wrote:
>>>>>
>>>>>
>>>>>>
>>>>>> Hello,
>>>>>> I had previously emailed regarding a heap size issue and have discovered
>>>>>> that hadoop-site.xml is not loading completely, i.e.
>>>>>>        Configuration defaults = new Configuration();
>>>>>>        JobConf jobConf = new JobConf(defaults, XYZ.class);
>>>>>> System.out.println("1:" + jobConf.get("mapred.child.java.opts"));
>>>>>> System.out.println("2:" + jobConf.get("mapred.map.tasks"));
>>>>>> System.out.println("3:" + jobConf.get("mapred.reduce.tasks"));
>>>>>>
>>>>>>
>>>>>> System.out.println("4:" + jobConf.get("mapred.tasktracker.reduce.tasks.maximum"));
>>>>>>
>>>>>> returns -Xmx200m, 2, 1, 2 respectively, even though the numbers in
>>>>>> hadoop-site.xml are very different.
>>>>>>
>>>>>> Is there a way for hadoop to dump the parameters read in from
>>>>>> hadoop-site.xml and hadoop-default.xml?
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>> Is your hadoop-site.xml present in the conf (HADOOP_CONF_DIR)
>>>>> directory?
>>>>>
>>>>> http://hadoop.apache.org/core/docs/r0.19.0/cluster_setup.html#Configuration
>>>>>
>>>>> -Amareshwari
>>>>>
>>>>>
>>>
>>> --
>>> Saptarshi Guha - [email protected]
>>>
>>>
>>
>>
>>
>>
>



--
Saptarshi Guha - [email protected]



Saptarshi Guha | [email protected] | http://www.stat.purdue.edu/~sguha
There's nothing wrong with teenagers that
reasoning with them won't aggravate.
