I'm setting up a Hadoop cluster and I have the NameNode and JobTracker up and running. However, I cannot get any of my DataNodes or TaskTrackers to start. Here is my hadoop-site.xml file...
<code>
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hadoop/h_temp</value>
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/home/hadoop/data</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>192.168.1.10:54310</value>
    <description>The name of the default file system. A URI whose scheme
    and authority determine the FileSystem implementation. The uri's scheme
    determines the config property (fs.SCHEME.impl) naming the FileSystem
    implementation class. The uri's authority is used to determine the
    host, port, etc. for a filesystem.</description>
    <final>true</final>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>192.168.1.10:54311</value>
    <description>The host and port that the MapReduce job tracker runs at.
    If "local", then jobs are run in-process as a single map and reduce
    task.</description>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>0</value>
    <description>Default block replication. The actual number of
    replications can be specified when the file is created. The default is
    used if replication is not specified in create time.</description>
  </property>
</configuration>
</code>

and here is the error I'm getting...
<code>
2009-04-15 14:00:48,208 INFO org.apache.hadoop.dfs.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = java.net.UnknownHostException: myhost: myhost
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.18.3
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.18 -r 736250; compiled by 'ndaley' on Thu Jan 22 23:12:0$
************************************************************/
2009-04-15 14:00:48,355 ERROR org.apache.hadoop.dfs.DataNode: java.net.UnknownHostException: myhost: myhost
        at java.net.InetAddress.getLocalHost(InetAddress.java:1353)
        at org.apache.hadoop.net.DNS.getDefaultHost(DNS.java:185)
        at org.apache.hadoop.dfs.DataNode.startDataNode(DataNode.java:249)
        at org.apache.hadoop.dfs.DataNode.<init>(DataNode.java:223)
        at org.apache.hadoop.dfs.DataNode.makeInstance(DataNode.java:3071)
        at org.apache.hadoop.dfs.DataNode.instantiateDataNode(DataNode.java:3026)
        at org.apache.hadoop.dfs.DataNode.createDataNode(DataNode.java:3034)
        at org.apache.hadoop.dfs.DataNode.main(DataNode.java:3156)
2009-04-15 14:00:48,356 INFO org.apache.hadoop.dfs.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at java.net.UnknownHostException: myhost: myhost
************************************************************/
</code>

--
View this message in context: http://www.nabble.com/Datanode-Setup-tp23064660p23064660.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.
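Separately from the stack trace, two details in the config above may be worth a second look: the 0.18 quickstart docs write fs.default.name with an explicit hdfs:// scheme, and dfs.replication of 0 is unusual (the shipped default is 3). A hedged sketch of how that first property is usually written, reusing the address from the post:

```xml
<!-- Variant of the fs.default.name property above, with the explicit
     hdfs:// scheme shown in the Hadoop 0.18 quickstart docs. -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://192.168.1.10:54310</value>
  <final>true</final>
</property>
```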
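For context, the UnknownHostException comes from InetAddress.getLocalHost() in the stack trace above, which points at the OS failing to resolve the machine's own hostname ("myhost") rather than at the Hadoop config itself. A quick way to check this on the datanode might look like the following sketch (the 192.168.1.11 address is a placeholder, not from the post):

```shell
# The DataNode calls InetAddress.getLocalHost(), which throws
# UnknownHostException when the local hostname has no DNS or /etc/hosts
# entry. First print the hostname, then try to resolve it.
hostname
getent hosts "$(hostname)" || echo "hostname '$(hostname)' does not resolve"

# A common fix is an /etc/hosts entry mapping the hostname to the node's
# address, e.g. (192.168.1.11 is a hypothetical IP for this datanode):
#   192.168.1.11   myhost
```

If `getent` reports no match, adding the hosts entry (or fixing DNS) on each slave node should let the DataNode and TaskTracker come up.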
