Hi Team,
I have a problem building Hadoop behind a proxy.
My Linux machine has the Maven proxy settings configured and working fine, but
the build fails with the following error, despite passing the username and
password.
mvn package -Pdist -Dtar -Dhttp.proxyHost=***.com -Dhttp.proxyPo
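In case it helps: Maven's own dependency downloads usually take their proxy
settings from ~/.m2/settings.xml rather than from -Dhttp.proxy* properties
passed on the command line, so the username and password may need to go there
as well. A minimal sketch of the <proxies> section (host, port, and
credentials below are placeholders, not your real values):

    <settings>
      <proxies>
        <proxy>
          <id>corp-proxy</id>                 <!-- any id -->
          <active>true</active>
          <protocol>http</protocol>
          <host>proxy.example.com</host>      <!-- placeholder host -->
          <port>8080</port>                   <!-- placeholder port -->
          <username>myuser</username>         <!-- placeholder credentials -->
          <password>mypassword</password>
          <nonProxyHosts>localhost</nonProxyHosts>
        </proxy>
      </proxies>
    </settings>

The -Dhttp.proxyHost/-Dhttp.proxyPort flags on the command line only reach
tools that honour the JVM proxy properties, which is why they may not be
enough on their own for the Maven part of the build.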
Hi Mingxi,
From your stack trace, I understand that the OutOfMemoryError actually
occurred while copying the map outputs, not while sorting them.
Since your map outputs are huge and your reducer does not have enough heap
memory to hold them, you got the problem.
When you made the number of reducers 200, the map outputs were spread across
more reducers, so each one had less to copy.
So, why, when map outputs are huge, is the reducer not able to copy them?
Can you please explain what the function of mapred.child.java.opts is,
and how it relates to the copy phase?
Thank you,
Mingxi
-----Original Message-----
From: Ravi teja ch n v [mailto:raviteja.c...@huawei.com]
Sent: Tuesday, November 29, 2011 9:46 PM
To: common-dev@hadoop.apache.org
And no, the box did not have 300 GB of RAM.
--Bobby Evans
On 12/1/11 4:12 AM, "Ravi teja ch n v" wrote:
Hi Mingxi,
>So, why, when map outputs are huge, is the reducer not able to copy them?
The Reducer will copy the map output into its in-memory buffer. When the
Reducer JVM does not have enough heap to hold those copies, it runs out of
memory.
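For reference, a rough sketch of the two knobs involved, using the classic
(MRv1) property names; the values below are only placeholders, tune them to
your cluster. mapred.child.java.opts sets the JVM options, including the
maximum heap, for every spawned map and reduce task, and
mapred.job.shuffle.input.buffer.percent bounds how much of that reducer heap
the shuffle may use for in-memory copies of map outputs:

    <property>
      <name>mapred.child.java.opts</name>
      <value>-Xmx1024m</value>   <!-- heap for each map/reduce task JVM; placeholder -->
    </property>
    <property>
      <name>mapred.job.shuffle.input.buffer.percent</name>
      <value>0.70</value>        <!-- fraction of reducer heap usable for in-memory copies; placeholder -->
    </property>

So the copy (shuffle) phase is affected by mapred.child.java.opts simply
because the in-memory copy buffer is carved out of the heap that this
property sets.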
Ravi Teja Ch N V created HADOOP-13124:
-------------------------------------
Summary: Support weights/priority for user in Faircallqueue
Key: HADOOP-13124
URL: https://issues.apache.org/jira/browse/HADOOP-13124
Project: Hadoop Common
URL: https://issues.apache.org/jira/browse/HADOOP-7926
Project: Hadoop Common
Issue Type: Bug
Components: build
Affects Versions: 0.23.0
Reporter: Ravi Teja Ch N V
This approach will help in knowing all the failures, even when some test cases fail.
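One common way to see every test failure in a Maven build rather than
stopping at the first one (an assumption on my part, since the issue
description is truncated and may propose something else) is:

    mvn test -fae                                # --fail-at-end: finish all modules, then report every failure
    mvn test -Dmaven.test.failure.ignore=true    # let Surefire record failing tests without aborting the build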