... files, you'll run out of blocks before you run out of space. This is one reason why HDFS/Hadoop is 'bad' for dealing with lots of small files.

You can check here: localhost:50070; that's the web page for your hdfs namenode. It has status information on your hdfs including size.
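For example, from the shell (a sketch, assuming a 0.20-era install with $HADOOP_HOME/bin on your PATH; exact flags and output vary by version):

    # Configured capacity, DFS used, and DFS remaining, per datanode:
    hadoop dfsadmin -report

    # Walk the namespace and list files and their blocks; the total
    # block count makes a "lots of small files" problem easy to spot:
    hadoop fsck / -files -blocks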
Pat
-----Original Message-----
From: Cam Bazz [mailto:camb...@gmail.com]
Sent: Friday, February 11, 2011 4:55 PM
To: user@hive.apache.org
Subject: Re: error out of all sudden
But is there a ridiculously low default for hdfs space limits? I looked everywhere in the configuration files, but could not find anything that limits the size of hdfs.
I think I am running on a 150 GB hard drive, and the data I am processing amounts to a couple of hundred megabytes at most.
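One way to sanity-check whether any explicit limit is set (a sketch; it assumes the conventional $HADOOP_HOME/conf layout, and dfs.datanode.du.reserved is the only space-related property one would normally expect to find):

    # Search the site config for space-related settings; if nothing
    # turns up, HDFS will use whatever the underlying disk provides:
    grep -A2 "du.reserved" $HADOOP_HOME/conf/hdfs-site.xml

    # dfs.datanode.du.reserved reserves bytes per volume for non-DFS
    # use; its default is 0, i.e. no limit beyond the disk itself.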
Is your hdfs hitting its space limits?
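You can compare what the OS sees with what HDFS reports (a sketch; the data directory path is an assumption, based on the 0.20 default of ${hadoop.tmp.dir}/dfs/data under /tmp):

    # Free space on the partition holding the datanode's block files:
    df -h /tmp/hadoop-${USER}/dfs/data

    # Total bytes HDFS thinks it is storing:
    hadoop fs -dus /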
Pat
-----Original Message-----
From: Cam Bazz [mailto:camb...@gmail.com]
Sent: Friday, February 11, 2011 4:38 PM
To: user@hive.apache.org
Subject: error out of all sudden
Hello,

I set up my one-node pseudo-distributed system and left it with a cronjob copying