It was missing the C++ libraries, though:
https://builds.apache.org/job/PreCommit-HADOOP-Build/2002/artifact/trunk/patchprocess/patchJavacWarnings.txt
-Todd
On Mon, Jan 7, 2013 at 3:46 PM, Giridharan Kesavan wrote:
I did install protoc on hadoop9 and brought it back online after testing it
a couple of hours back.
-Giri
On Mon, Jan 7, 2013 at 3:35 PM, Todd Lipcon wrote:
OK. FYI, installed protoc on asf009, and the "g++-4.4-multilib"
packages on both asf008 and asf009. Checked the hadoop pipes native
build and it passes now. Fingers crossed...
-Todd
On Mon, Jan 7, 2013 at 3:35 PM, Todd Lipcon wrote:
I'll install the right protoc and libstdc++ dev on asf009 as well.
-Todd
On Mon, Jan 7, 2013 at 9:57 AM, Andrew Wang wrote:
I think hadoop9 has a similar problem as hadoop8, based on a recent build.
The javac output has a compile-proto error:
https://builds.apache.org/job/PreCommit-HDFS-Build/3755/
https://builds.apache.org/job/PreCommit-HDFS-Build/3755/artifact/trunk/patchprocess/trunkJavacWarnings.txt
HAServiceProtocol.proto:21:8: Option "java_generic_services" unknown.
This is probably caused by an older version of protoc in the build env.
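That kind of mismatch can be caught up front so a slave fails fast instead of dying inside compile-proto. A minimal sketch (check_protoc_version is a made-up helper, not part of the build scripts):

```shell
#!/bin/sh
# Sketch: guard a build against the wrong protoc version.
# check_protoc_version is a hypothetical helper, not an existing script.
REQUIRED=2.4.1

check_protoc_version() {
  # $1 is "protoc --version" output ("libprotoc X.Y.Z"), $2 the required version.
  actual="${1#libprotoc }"
  if [ "$actual" = "$2" ]; then
    echo "ok: protoc $actual"
  else
    echo "mismatch: have $actual, need $2"
    return 1
  fi
}

# On a slave that has protoc installed:
if command -v protoc >/dev/null 2>&1; then
  check_protoc_version "$(protoc --version)" "$REQUIRED" || true
fi
```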
On Sun, Jan 6, 2013 at 2:12 PM, Giridharan Kesavan wrote:
By looking at the failure log:
https://builds.apache.org/view/Hadoop/job/PreCommit-HADOOP-Build/1950/artifact/trunk/patchprocess/trunkJavacWarnings.txt
build failed on
[INFO] --- exec-maven-plugin:1.2:exec (compile-proto) @ hadoop-common ---
HAServiceProtocol.proto:21:8: Option "java_generic_services" unknown.
I am not sure if this problem is solved; the build still failed in
precommit-HADOOP
https://builds.apache.org/view/Hadoop/job/PreCommit-HADOOP-Build/
On Sat, Jan 5, 2013 at 6:46 AM, Giridharan Kesavan wrote:
Guys should builds@ be copied on this?
Cheers,
Chris
On 1/4/13 11:15 AM, "Todd Lipcon" wrote:
Marking the slave offline would do. I've marked the hadoop8 slave offline,
while I test it for builds, and will bring it back online later when it's good.
-Giri
On Fri, Jan 4, 2013 at 2:26 PM, Todd Lipcon wrote:
I think I installed protoc in /usr/local, and this is what I see:
gkesavan@asf008:~$ which protoc
/usr/local/bin/protoc
-Giri
On Fri, Jan 4, 2013 at 2:11 PM, Todd Lipcon wrote:
Turns out I had to both kill -9 it and chmod 000
/home/jenkins/jenkins-slave in order to keep it from auto-respawning.
Just a note so that once the toolchain is fixed, someone knows to
re-chmod back to 755.
-Todd
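For whoever does the re-enabling, the disable/respawn-block trick above can be rehearsed safely on a scratch directory first (on hadoop8 the real path is /home/jenkins/jenkins-slave):

```shell
# Demonstrate the chmod disable/enable trick on a throwaway directory.
d=$(mktemp -d)
chmod 000 "$d"        # nothing can respawn from here now
stat -c '%a' "$d"     # prints: 0
chmod 755 "$d"        # the re-enable step once the toolchain is fixed
stat -c '%a' "$d"     # prints: 755
rmdir "$d"
```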
On Fri, Jan 4, 2013 at 2:11 PM, Todd Lipcon wrote:
I'm going to kill -9 the jenkins slave on hadoop8 for now cuz it's
causing havoc on the precommit builds. I can't see another way to
administratively disable it from the Jenkins interface.
Rajiv, Giri -- mind if I build/install protoc into /usr/local to match
the other slaves? We can continue the
In addition to protoc, can someone please also install a 32-bit C++ compiler?
The builds are all failing on this machine because of that.
regards,
Colin
On Fri, Jan 4, 2013 at 11:37 AM, Giridharan Kesavan wrote:
When I configured the other machines I used the source to compile and
install the protoc, as 2.4.1 wasn't available in the Ubuntu repo.
BTW, installed 2.4.1 on asf008:
gkesavan@asf008:~$ protoc --version
libprotoc 2.4.1
-Giri
On Thu, Jan 3, 2013 at 11:24 PM, Todd Lipcon wrote:
No, there is no CM on these machines. What's required is mentioned in
the d-i or post scripts.
Since the conversation about build slaves came up, is there a
requirement for having Ubuntu? I would prefer to install new machines
with CentOS. (d-i is a pta. Sorry, I don't want to start a debate here.)
I've always liked puppet for distributing config files, but always
thought it kind of silly for distributing big binaries like toolchains.
Seems just as easy to just make a 15-line shell script to wget, tar
xzf, and make install.
Definitely agree puppet makes sense for ensuring the right deb
packag
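A sketch of that ~15-line script (the download URL is a placeholder, and every command goes through a dry-run wrapper, so this sketch doesn't actually fetch or build anything):

```shell
#!/bin/sh
# Hypothetical toolchain bootstrap: wget, tar xzf, make install into a
# prefix. The URL below is illustrative, not the real mirror.
PREFIX="${1:-/opt/hadoop-toolchain}"
PB=protobuf-2.4.1

plan() { echo "+ $*"; }   # replace the echo with "$@" to really run it

plan wget "http://example.org/downloads/$PB.tar.gz"
plan tar xzf "$PB.tar.gz"
plan "cd $PB && ./configure --prefix=$PREFIX && make && make install"
```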
Do I hear puppet? :)
Cos
On Fri, Jan 04, 2013 at 11:08AM, Todd Lipcon wrote:
I agree -- I'd like to see us have a shell script of some sort which,
given a prefix, downloads and installs the needed toolchain
dependencies.
We could then download that script onto the build machines and install
into something like /opt/hadoop-toolchain/
AFAIK the only real dependencies we have
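If the toolchain does land in /opt/hadoop-toolchain, a build job could put it ahead of the system tools roughly like this (the layout is assumed for illustration, not what was actually set up on the slaves):

```shell
# Put the pinned toolchain first for this build; directory layout assumed.
TOOLCHAIN=/opt/hadoop-toolchain
export PATH="$TOOLCHAIN/bin:$PATH"
export LD_LIBRARY_PATH="$TOOLCHAIN/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```

With that in a job's environment, `which protoc` would resolve to the pinned copy instead of whatever the OS ships.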
asf008 has been up for a while. It was probably just added as a slave.
All the dependencies should probably be installed in a build_prefix, to
avoid conflicts with OS-specific packages and allow multiple projects to
build on the same machines. This is a better alternative to
provisioning vms for un
I'm on it.
-Giri
On Thu, Jan 3, 2013 at 11:24 PM, Todd Lipcon wrote:
Hey folks,
It looks like hadoop8 has recently come back online as a build slave,
but is failing all the builds because it has an ancient version of
protobuf (2.2.0):
todd@asf008:~$ protoc --version
libprotoc 2.2.0
In contrast, other slaves have 2.4.1:
todd@asf001:~$ protoc --version
libprotoc 2.4.1