Thanks! Can I compile just the source at that repo and use it as is? I mean, without having any Hadoop source code (other than the HDFS code at the URL I mentioned), and without needing to integrate it with compiled Hadoop code? Just as if it were a separate, standalone project.
On Thu, Jun 10, 2010 at 10:50 PM, Jitendra Nath Pandey
<jiten...@yahoo-inc.com> wrote:
> You can test HDFS without setting up a map-reduce cluster, if that's what
> you mean.
>
> Instead of bin/start-all.sh, use bin/start-dfs.sh, and you can skip the
> configuration related to mapreduce.
>
> To test it, use the DFS command line: "bin/hadoop dfs".
>
>
> On 6/10/10 1:16 PM, "Alberich de megres" <alberich...@gmail.com> wrote:
>
> Thanks for the quick reply,
>
> But I'm talking about just HDFS.. is it possible to test it separately?
> The source code is available at:
> http://github.com/apache/hadoop-hdfs
>
> I compiled it, and now I want to test it (aside from Hadoop).
>
>
> On Thu, Jun 10, 2010 at 9:37 PM, Jitendra Nath Pandey
> <jiten...@yahoo-inc.com> wrote:
>> This link should help:
>> http://wiki.apache.org/hadoop/QuickStart
>>
>>
>> On 6/10/10 12:20 PM, "Alberich de megres" <alberich...@gmail.com> wrote:
>>
>> Hello!
>>
>> I'm new to HDFS; I just downloaded the source code and compiled it.
>>
>> Now I want to execute it on 2 machines, but I don't know how to start
>> the servers.
>>
>> Is there any web page or doc, or can someone shed some light on how to
>> get started?
>>
>> Thanks!!
>> Alberich
>>
>>
>
>
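
For anyone following the thread, here is a rough sketch of the HDFS-only setup
described above, assuming a standard Hadoop release layout of that era. The
hostnames, port, and paths below are placeholders, not values from this thread.

  conf/core-site.xml (points clients and datanodes at the namenode):

    <configuration>
      <property>
        <name>fs.default.name</name>
        <value>hdfs://namenode-host:9000</value>
      </property>
    </configuration>

  conf/hdfs-site.xml (with only two machines, keep replication low):

    <configuration>
      <property>
        <name>dfs.replication</name>
        <value>1</value>
      </property>
    </configuration>

  conf/slaves (one datanode hostname per line; placeholder here):

    datanode-host

  Then format the namenode once and start HDFS only, with no mapreduce:

    bin/hadoop namenode -format
    bin/start-dfs.sh

  And smoke-test it with the DFS command line mentioned above:

    bin/hadoop dfs -mkdir /test
    bin/hadoop dfs -put conf/core-site.xml /test/
    bin/hadoop dfs -ls /test

Note that bin/start-dfs.sh reaches the hosts listed in conf/slaves over ssh, so
passwordless ssh between the two machines makes the startup script much easier
to use.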