Re: Hadoop Build Server

2014-03-19 Thread Omar@Gmail
OK, did the following directly on Mac OS X Mavericks 10.9.2 (i.e. without VMware) - I svn checked out hadoop-trunk - installed Xcode then installed Protocol Buffers - from hadoop-trunk I invoked [mvn clean install] - got the following ... Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time ela
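The steps reported above can be sketched as a shell command fragment (not run here; the checkout URL is the one quoted later in this thread, and the protoc check assumes Protocol Buffers was already installed):

```shell
# Sketch of the build steps described above; assumes Xcode command-line
# tools and Protocol Buffers are already installed
svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk
cd hadoop-trunk
protoc --version     # the Hadoop 2.x build expects protoc on the PATH
mvn clean install
```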

Re: Hadoop Build Server

2014-03-17 Thread Omar@Gmail
Thanks, will check that. On 17 March 2014 13:29, Steve Loughran wrote: > sounds like your network is not consistent with hadoop's expectations. VMs > are particularly fun here, while ubuntu's attempts to hide the truth hook > your host up to 127.0.1.1 if you are not careful > > make sure that the

Re: Hadoop Build Server

2014-03-17 Thread Steve Loughran
sounds like your network is not consistent with hadoop's expectations. VMs are particularly fun here, while ubuntu's attempts to hide the truth hook your host up to 127.0.1.1 if you are not careful. Make sure that there are hosts entries for all the machines so that they know their own names, it mat
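A minimal sketch of the kind of hosts entries being suggested (hostnames and addresses are hypothetical, and the file is written to /tmp here rather than /etc/hosts); the point is that each machine's own name should resolve to a routable address rather than Ubuntu's 127.0.1.1 alias:

```shell
# Hypothetical hosts entries for a small cluster
cat <<'EOF' > /tmp/hosts.example
127.0.0.1     localhost
192.168.56.10 hadoop-master.local hadoop-master
192.168.56.11 hadoop-worker1.local hadoop-worker1
EOF
# Check that the master's name maps to its LAN address, not a loopback alias
awk '$2 == "hadoop-master.local" {print $1}' /tmp/hosts.example
# → 192.168.56.10
```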

Re: Hadoop Build Server

2014-03-15 Thread Ted Yu
You can use brew to install protoc which is required for building hadoop 2 (and newer releases). I haven't used gcc yet. Cheers On Mar 15, 2014, at 7:30 PM, "Omar@Gmail" wrote: > Also do I understand you correctly that you have been able to setup the > hadoop development environment on mac
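The brew route mentioned above would look roughly like this (a command fragment; the formula name is Homebrew's, and the exact protoc version Hadoop expects should be checked against the source tree's BUILDING.txt):

```shell
brew install protobuf   # installs the protoc compiler
protoc --version        # verify the version before building Hadoop
```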

Re: Hadoop Build Server

2014-03-15 Thread Omar@Gmail
Also, do I understand you correctly that you have been able to set up the hadoop development environment on Mac OS X? If so, did you have to install Protocol Buffers, gcc, g++, etc.? On 16 March 2014 01:00, Ted Yu wrote: > If you have time, you can dig a little bit to find out why > TestNetUtils

Re: Hadoop Build Server

2014-03-15 Thread Omar@Gmail
Sure will give it a try and update you, thanks. On 16 March 2014 01:00, Ted Yu wrote: > If you have time, you can dig a little bit to find out why > TestNetUtils#testNormalizeHostName > failed (passed locally on my Mac). > > Use the following command: > > mvn clean package -DskipTests eclipse:e

Re: Hadoop Build Server

2014-03-15 Thread Ted Yu
If you have time, you can dig a little bit to find out why TestNetUtils#testNormalizeHostName failed (passed locally on my Mac). Use the following command: mvn clean package -DskipTests eclipse:eclipse After that, you can import hadoop into Eclipse. You can step into the following call: Lis
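The workflow Ted describes, plus re-running just the failing test, can be sketched as a command fragment (the module path is an assumption about where TestNetUtils lived in trunk at the time):

```shell
# Generate Eclipse project files without running the test suite
mvn clean package -DskipTests eclipse:eclipse
# Re-run only the failing test class from the hadoop-common module
cd hadoop-common-project/hadoop-common
mvn test -Dtest=TestNetUtils
```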

Re: Hadoop Build Server

2014-03-15 Thread Omar@Gmail
I'm following instructions from http://wiki.apache.org/hadoop/HowToContribute I've checked out hadoop project using: svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk When trying to build from root (i.e. hadoop-trunk) I get the errors I have mentioned before. Also

Re: Hadoop Build Server

2014-03-15 Thread Ted Yu
From https://builds.apache.org/job/Hadoop-trunk/694/console : Build timed out (after 200 minutes). Marking the build as aborted. Build was aborted The above shows how long building all sub-projects of Hadoop might take. --- https://builds.apache.org/job/Hadoop-Yarn-trunk builds hadoop-ya

Re: Hadoop Build Server

2014-03-15 Thread Omar@Gmail
Getting Results: Failed tests: TestNetUtils.testNormalizeHostName:619 expected:<[81.200.64.50]> but was:<[UnknownHost123]> TestZKFailoverController.testGracefulFailoverFailBecomingActive:484 Did not fail to graceful failover when target failed to become active! TestZKFailoverController.tes
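One way to see why the hostname normalization resolved to an unexpected address is to check what the OS-level lookup returns for the machine's own name; a sketch (outputs are machine-specific, so none are shown):

```shell
hostname                        # the name this machine reports for itself
ping -c 1 "$(hostname)"         # does that name resolve, and to which address?
grep "$(hostname)" /etc/hosts   # is there a static entry, and what does it map to?
```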

Re: Hadoop Build Server

2014-03-15 Thread Omar@Gmail
I just took another svn update and am building again; will email which module is failing for me. Thanks On 15 March 2014 18:15, Ted Yu wrote: > There're several Jenkins jobs for hadoop. > e.g. > https://builds.apache.org/job/Hadoop-Yarn-trunk <https://builds.apache.org/job/Hadoop-Yarn-trunk/510/

Re: Hadoop Build Server

2014-03-15 Thread Ted Yu
There're several Jenkins jobs for hadoop. e.g. https://builds.apache.org/job/Hadoop-Yarn-trunk https://builds.apache.org/job/Hadoop-hdfs-trunk/ Which module are you looking at ? Cheers On Sat, Mar 15, 2014 at 11:00 AM, Omar@Gmail wrot