Hi,
Have you formatted your NameNode by using
hadoop namenode -format
Regards,
Yogesh Kumar
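For reference, the suggested sequence looks roughly like this (a sketch assuming the Hadoop scripts are on PATH; formatting wipes existing HDFS metadata, so only do it on a fresh cluster):

```shell
# Format the NameNode and bring HDFS up. Destructive: wipes HDFS metadata.
# Guarded so nothing runs if hadoop is not installed on this machine.
if command -v hadoop >/dev/null 2>&1; then
  hadoop namenode -format   # answer Y when prompted
  start-dfs.sh              # starts NameNode, DataNode, SecondaryNameNode
  jps                       # NameNode should appear in the list
else
  echo "hadoop not found on PATH"
fi
```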
> From: are9...@nyp.org
> To: user@hive.apache.org; common-u...@hadoop.apache.org
> Date: Sat, 7 Jul 2012 21:37:16 -0400
> Subject: Re: namenode starting error
>
> Check your firewall settings, specifically whether port 54310 is open. The
> other ports to look at are:
> 50010
> 50020
> 50030
> 50060
> 50070
> 54311
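Those ports can be probed from the client box with a quick loop (a sketch relying on bash's `/dev/tcp` redirection; "closed" here can also mean the daemon simply isn't listening, not only a firewall block):

```shell
# Probe each Hadoop port on localhost; prints open/closed per port.
for port in 54310 54311 50010 50020 50030 50060 50070; do
  if (exec 3<>"/dev/tcp/localhost/$port") 2>/dev/null; then
    echo "port $port: open"
  else
    echo "port $port: closed"
  fi
done
```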
On 6/20/12 5:22 AM, "soham sardar" wrote:
>the thing is I have updated my JAVA_HOME in both .bashrc and hadoop-env.sh
>
>now the problem is when I try
>jps is
This is because you are not able to contact the NameNode, as it is not running. Not only -ls; none of the HDFS commands will work in such a situation. Was your NN stopped after getting started, or did it never start even once? Can you paste your logs here?
Regards,
Mohammad Tariq
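A quick way to check whether the NameNode JVM is actually up (a sketch; `jps` ships with the JDK, so the `ps` fallback is only for JRE-only boxes):

```shell
# List running Java processes; a healthy HDFS shows a NameNode entry in jps.
if command -v jps >/dev/null 2>&1; then
  jps
else
  ps -e -o comm= | grep -i java || echo "no java processes found"
fi
```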
On Wed,
No, I guess it still doesn't serve the purpose. Actually, the error is:
hduser@XPS-L501X:/home/soham/cloudera/hadoop-2.0.0-cdh4.0.0/etc/hadoop$
hadoop fs -ls
12/06/20 15:03:51 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 0 time(s).
12/06/20 15:03:52 INFO i
Change the line "127.0.1.1" in your /etc/hosts file to "127.0.0.1". Also check whether there is any problem with the configuration properties.
Regards,
Mohammad Tariq
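The hosts-file edit can be done with sed (a sketch demonstrated on a sample file, assuming GNU sed's -i; for the real fix, run the same sed line against /etc/hosts with sudo after taking a backup):

```shell
# Sample hosts file with the Ubuntu-style 127.0.1.1 alias (the hostname is
# illustrative, taken from the shell prompt in the thread).
printf '127.0.0.1 localhost\n127.0.1.1 XPS-L501X\n' > hosts.sample
# Rewrite the alias line so the hostname resolves to 127.0.0.1.
sed -i 's/^127\.0\.1\.1/127.0.0.1/' hosts.sample
cat hosts.sample
```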
On Wed, Jun 20, 2012 at 2:52 PM, soham sardar wrote:
> the thing is I have updated my JAVA_HOME in both .bashrc and hadoop-env.sh