The CDH forum might be a better place for this; I've never used CDH.
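
That said, the error below looks like a plain reducer heap problem, and the
mapred.job.*.memory.mb settings suggested further down only raise the limits
that the scheduler enforces; they don't grow the JVM heap itself. A rough
sketch of what I'd try from the hive prompt, assuming an MR1-style CDH setup
(the values are just examples, tune them for your cluster):

set mapred.job.reduce.memory.mb=4000;
-- raise the actual reducer JVM heap; keep -Xmx below the memory.mb limit
-- so the task isn't killed for exceeding its allowance
set mapred.reduce.child.java.opts=-Xmx3072m;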

On Fri, Nov 23, 2012 at 2:28 PM, Peter Marron <peter.mar...@trilliumsoftware.com> wrote:

>  Hi Nitin,
>
> Can I set these parameters through the CDH management interface?
> If not, then what file do they need to be set in to make sure that CDH
> picks them up?
>
> Peter Marron
> Trillium Software UK Limited
>
> Tel : +44 (0) 118 940 7609
> Fax : +44 (0) 118 940 7699
> E: peter.mar...@trilliumsoftware.com
>
> From: Nitin Pawar [mailto:nitinpawar...@gmail.com]
> Sent: 23 November 2012 08:55
> To: user@hive.apache.org
> Subject: Re: Creating Indexes again
>
> try increasing the ulimit on your hadoop cluster, and also increase the
> memory for both map and reduce tasks by setting them in hive:
>
> set mapred.job.map.memory.mb=6000;
> set mapred.job.reduce.memory.mb=4000;
>
> you can change the values based on the hadoop cluster you have set up
>
> On Fri, Nov 23, 2012 at 2:17 PM, Peter Marron <peter.mar...@trilliumsoftware.com> wrote:
>
> Hi,
>
> I’m trying to create indexes in Hive, and I’ve switched
> to using CDH-4. The creation of the index is failing and
> it’s pretty obvious that the reducers are running out of
> heap space. When I use the web interface for the
> “Hadoop reduce task list” I can find this entry:
>
> Error: Java heap space
> Error: GC overhead limit exceeded
> org.apache.hadoop.io.SecureIOUtils$AlreadyExistsException: EEXIST: File exists
>         at org.apache.hadoop.io.SecureIOUtils.createForWrite(SecureIOUtils.java:178)
>         at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:303)
>         at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:376)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>         at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: EEXIST: File exists
>         at org.apache.hadoop.io.nativeio.NativeIO.open(Native Method)
>         at org.apache.hadoop.io.SecureIOUtils.createForWrite(SecureIOUtils.java:172)
>         ... 7 more
>
> Error: GC overhead limit exceeded
>
> If this e-mail shouldn’t be here and should only be on
> a Cloudera mailing list, please re-direct me.
>
> Thanks in advance.
>
> Peter Marron
> Trillium Software UK Limited
>
> Tel : +44 (0) 118 940 7609
> Fax : +44 (0) 118 940 7699
> E: peter.mar...@trilliumsoftware.com
>
> --
> Nitin Pawar
>



-- 
Nitin Pawar
