Re: Empty value in write() method : Custom Datatype in Hadoop MapReduce

2013-10-30 Thread unmesha sreeveni
Empty value in write() method : Custom Datatype in Hadoop MapReduce [Solved]

On Wed, Oct 30, 2013 at 11:23 AM, unmesha sreeveni wrote:
> I am emitting two 2D double arrays as key and value. I am constructing my
> WritableComparable class.
>
> public class MF implements WritableComparable {
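The thread concerns serializing 2D double arrays in a custom Writable. As a rough sketch of the serialization pattern involved (the class and field names are illustrative, not the poster's actual MF class; a real Hadoop type would implement org.apache.hadoop.io.WritableComparable, while this standalone version uses only java.io so it compiles without Hadoop on the classpath):

```java
import java.io.*;
import java.util.Arrays;

// Illustrative sketch: serializes a 2D double array the way a Hadoop
// Writable's write()/readFields() pair would. A common cause of "empty
// values" is forgetting to write the array dimensions before the cells,
// leaving readFields() no way to know how much data to read back.
public class MatrixSketch {
    private double[][] matrix = new double[0][0];

    public MatrixSketch() {}                       // Writables need a no-arg constructor
    public MatrixSketch(double[][] m) { matrix = m; }
    public double[][] get() { return matrix; }

    // Mirrors Writable.write(DataOutput): dimensions first, then cells.
    public void write(DataOutput out) throws IOException {
        out.writeInt(matrix.length);
        out.writeInt(matrix.length == 0 ? 0 : matrix[0].length);
        for (double[] row : matrix)
            for (double cell : row)
                out.writeDouble(cell);
    }

    // Mirrors Writable.readFields(DataInput): read back in the same order.
    public void readFields(DataInput in) throws IOException {
        int rows = in.readInt(), cols = in.readInt();
        matrix = new double[rows][cols];
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++)
                matrix[r][c] = in.readDouble();
    }

    public static void main(String[] args) throws IOException {
        MatrixSketch a = new MatrixSketch(new double[][] {{1.5, 2.0}, {3.25, 4.0}});
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        a.write(new DataOutputStream(buf));

        MatrixSketch b = new MatrixSketch();
        b.readFields(new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));
        System.out.println(Arrays.deepEquals(a.get(), b.get()));  // round-trip check
    }
}
```

The key point is that write() and readFields() must be exact mirrors of each other, field for field and in the same order.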

Re: test-patch failing with OOM errors in javah

2013-10-30 Thread Omkar Joshi
BTW, I am doing this on my local Mac machine, not on the build machines.

Thanks,
Omkar Joshi
Hortonworks Inc.

On Wed, Oct 30, 2013 at 4:47 PM, Omkar Joshi wrote:
> yes, even I do the same. I just use this to build:
>
> export _JAVA_OPTIONS="-Djava.awt.headless=true -X

Re: test-patch failing with OOM errors in javah

2013-10-30 Thread Omkar Joshi
Yes, even I do the same. I just use this to build:

export _JAVA_OPTIONS="-Djava.awt.headless=true -Xmx2048m -Xms2048m"
mvn clean install package -Pdist -Dtar -DskipTests -Dmaven.javadoc.skip=true

Thanks,
Omkar Joshi
Hortonworks Inc.

On Wed, Oct 30, 2013 at 3:39 PM

[jira] [Created] (HADOOP-10077) o.a.h.s.Groups should refresh in the background

2013-10-30 Thread Colin Patrick McCabe (JIRA)
Colin Patrick McCabe created HADOOP-10077:
-
Summary: o.a.h.s.Groups should refresh in the background
Key: HADOOP-10077
URL: https://issues.apache.org/jira/browse/HADOOP-10077
Project: Hadoop Co
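The idea behind the JIRA title, refreshing a cached lookup in the background rather than making callers wait while an entry expires, can be sketched as follows (this class and its names are hypothetical illustrations, not Hadoop's actual o.a.h.s.Groups implementation, which is considerably more involved):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Function;

// Hypothetical sketch of a background-refreshing cache: callers always
// read the current snapshot without blocking, while a daemon thread
// periodically re-resolves every known key.
public class RefreshingCache<K, V> {
    private final ConcurrentHashMap<K, V> cache = new ConcurrentHashMap<>();
    private final Function<K, V> loader;
    private final ScheduledExecutorService scheduler =
        Executors.newSingleThreadScheduledExecutor(r -> {
            Thread t = new Thread(r, "cache-refresh");
            t.setDaemon(true);   // do not keep the JVM alive for refreshes
            return t;
        });

    public RefreshingCache(Function<K, V> loader, long periodMs) {
        this.loader = loader;
        // Re-resolve all cached entries in the background at a fixed rate.
        scheduler.scheduleAtFixedRate(
            () -> cache.replaceAll((k, v) -> loader.apply(k)),
            periodMs, periodMs, TimeUnit.MILLISECONDS);
    }

    // Callers never wait on a refresh; only the very first lookup of a
    // key pays the resolution cost.
    public V get(K key) {
        return cache.computeIfAbsent(key, loader);
    }
}
```

The design choice is the same one the summary suggests: stale-but-fast reads from the snapshot, with freshness handled off the caller's thread.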

Re: test-patch failing with OOM errors in javah

2013-10-30 Thread Roman Shaposhnik
I can take a look sometime later today. In the meantime, I can only say that
I've been running into the 1 GB limit in a few builds of late. These days I
just go with 2 GB by default.

Thanks,
Roman.

On Wed, Oct 30, 2013 at 3:33 PM, Alejandro Abdelnur wrote:
> The following is happening in builds for MAPRE

test-patch failing with OOM errors in javah

2013-10-30 Thread Alejandro Abdelnur
The following is happening in builds for MAPREDUCE and YARN patches. I've
seen the failures on the hadoop5 and hadoop7 machines. I've increased Maven
memory to 1 GB (export MAVEN_OPTS="-Xmx1024m" in the Jenkins jobs), but some
failures still persist: https://builds.apache.org/job/PreCommit-MAPREDUCE-Buil

Re: Question on hadoop dependencies.

2013-10-30 Thread Petar Tahchiev
Hi Roman,

Looks like they have already upgraded to 2.2
(https://issues.apache.org/jira/browse/SOLR-5382) and will be shipping it in
Solr 4.6. I just hope you guys release a cleaned-up 2.3 first :)

2013/10/30 Roman Shaposhnik
> On Wed, Oct 30, 2013 at 1:07 PM, Steve Loughran wrote:
> > On 30 October

Re: Question on hadoop dependencies.

2013-10-30 Thread Roman Shaposhnik
On Wed, Oct 30, 2013 at 1:07 PM, Steve Loughran wrote:
> On 30 October 2013 13:07, Petar Tahchiev wrote:
>> So spring-data-solr (1.1.SNAPSHOT) uses solr 4.5.1 (just came out a few
>> days ago), which uses Hadoop 2.0.5-alpha.
>> I would be glad if we can clean up the poms a bit and leave only the

Re: Question on hadoop dependencies.

2013-10-30 Thread Steve Loughran
On 30 October 2013 13:07, Petar Tahchiev wrote:
> Oh, hi Steve,
>
> didn't know you were on this list :)
...
Well, I didn't know you were doing Hadoop stuff.

> So spring-data-solr (1.1.SNAPSHOT) uses solr 4.5.1 (just came out a few
> days ago), which uses Hadoop 2.0.5-alpha.
> I would be

Hadoop in Fedora updated to 2.2.0

2013-10-30 Thread Robert Rati
I've updated the version of Hadoop in Fedora 20 to 2.2.0. This means Hadoop
2.2.0 will be included in the official release of Fedora 20.

Hadoop on Fedora is running against numerous updated dependencies, including:

Java 7 (OpenJDK IcedTea)
Jetty 9
Tomcat 7
Jets3t 0.9.0

I've logged/update

Re: Question on hadoop dependencies.

2013-10-30 Thread Petar Tahchiev
Oh, hi Steve, didn't know you were on this list :) ...

So spring-data-solr (1.1.SNAPSHOT) uses solr 4.5.1 (just came out a few days
ago), which uses Hadoop 2.0.5-alpha. I would be glad if we can clean up the
poms a bit and leave only the dependencies that Hadoop really depends on.
I'll drop an em

Re: Question on hadoop dependencies.

2013-10-30 Thread Steve Loughran
Why, hello Petar,

Which version are you using? The reason those dependencies are declared is
that things like Jetty use them, and the classpath for server-side Hadoop is
"things needed to run Hadoop". Client-side, I think there's too much in the
Maven dependency tree (servlets, jetty, ...)
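A downstream consumer who only needs the client-side artifacts can usually trim the tree with Maven exclusions. A hedged illustration, not an official recipe; the excluded artifact here is just an example, and you should check the output of `mvn dependency:tree` to see what your build actually pulls in:

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.2.0</version>
  <exclusions>
    <!-- Example only: drop a server-side transitive dependency
         that pure client code never touches. -->
    <exclusion>
      <groupId>org.mortbay.jetty</groupId>
      <artifactId>jetty</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```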

Build failed in Jenkins: Hadoop-Common-trunk #937

2013-10-30 Thread Apache Jenkins Server
See Changes:

[sandy] YARN-1306. Clean up hadoop-sls sample-conf according to YARN-1228 (Wei Yan via Sandy Ryza)

[arp] HDFS-5436. Move HsFtpFileSystem and HFtpFileSystem into org.apache.hdfs.web. (Contributed by Haohui Mai)

[bikas