Thanks Harsh, that helped.

Trupti.

On Sat, Feb 9, 2013 at 3:51 PM, Harsh J <ha...@cloudera.com> wrote:

> Hi Trupti,
>
> Welcome! My responses inline.
>
> On Sat, Feb 9, 2013 at 7:59 PM, Trupti Gaikwad <trups.gaik...@gmail.com>
> wrote:
> > Hi,
> >
> > I want to work on release 1.0.4 source code. As per Hadoop
> > wiki HowToContribute, I can download source code from trunk or from
> release
> > 1.0.4 tag.
>
> Although I do not know your goal here, note that the trunk is the best
> place to do dev work if your goal is also to get your work accepted at
> the end. We allow 1.x to continue receiving improvements but refuse
> divergence in features compared to trunk and the ongoing branch-2
> releases. Just something to consider!
>
> > 1. Source code from hadoop/common/trunk at revision 1397701,
> > corresponding to release 1.0.4:
> > I checked out the source at SVN revision 1397701, the revision mentioned
> > in the release tag. The source compiles, but the tar file created by the
> > build does not contain start-mapred.sh; it does contain start-yarn.sh.
> > Even if the source revision is old, why am I not getting start-mapred.sh?
> > I really don't want to use the resourcemanager or nodemanager to run my
> > mapred job. How can I start the jobtracker and tasktracker?
>
> Unfortunately, SVN revisions aren't exactly what you think they are.
> What you need is to check out a tag, not a revision. To get the 1.0.4
> tag checked out from the Apache SVN repository, your command could be:
>
> $ svn checkout http://svn.apache.org/repos/asf/hadoop/common/tags/release-1.0.4/ hadoop-1.0.4
> $ cd hadoop-1.0.4/
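>
> As a quick sanity check, "svn info" inside the working copy should
> report the tag URL rather than trunk; output abbreviated here:
>
> $ svn info | grep '^URL'
> URL: http://svn.apache.org/repos/asf/hadoop/common/tags/release-1.0.4
>
> If that shows .../trunk instead, you're on the wrong code line.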
>
> Likewise, if you want to work on the tip of the 1.x branch instead,
> check out the branch "branch-1":
>
> $ svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-1/ hadoop-1
> $ cd hadoop-1/
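>
> Once you're on the 1.x code, the old ant build applies and the scripts
> you were looking for exist again. Roughly, assuming the default targets
> in the 1.x build.xml and a configured conf/ directory (check build.xml
> for the exact target names):
>
> $ ant                       # compile
> $ ant tar                   # build a distribution tarball
>
> # then, from a deployed and configured 1.x installation:
> $ bin/start-mapred.sh       # starts the jobtracker and tasktrackers
> # or per daemon:
> $ bin/hadoop-daemon.sh start jobtracker
> $ bin/hadoop-daemon.sh start tasktracker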
>
> > 2. Source code from the release-1.0.4 tag:
> > The Hadoop wiki also mentions that if I want to work against a specific
> > release, I have to download that release's tag.
> > I copied my code into src and tried to build it. However, my code does
> > not compile, because I developed it against the hadoop-common project
> > above. I am getting compilation errors, as there are inconsistencies in
> > the org.apache.hadoop.fs.FileSystem interface. Shall I develop my class
> > by implementing the interfaces provided in release 1.0.4?
>
> You're attempting to build trunk (accidentally, in your case). See
> above for getting proper 1.x code.
>
> However, if you still wish to build trunk, whose build system is
> different from the older 1.x system, some simple notes for building
> trunk can be found here:
> http://wiki.apache.org/hadoop/QwertyManiac/BuildingHadoopTrunk
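>
> In short, trunk builds with Maven rather than ant; roughly (the page
> above has the authoritative, current steps):
>
> $ mvn clean install -DskipTests
> $ mvn package -Pdist -DskipTests -Dtar   # produces a distribution tarball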
>
> > So:
> > 1. How do I get all the projects from hadoop-common?
> > 2. What is the correct way to compile and deploy any changes to core for
> > release 1.0.4?
>
> I believe I've answered both questions in the inline responses above. Do
> feel free to post any further questions you have!
>
> --
> Harsh J
>
