Hi Amandeep,
I've copied the following lines from a site:
----------
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
This can have two causes:
* Your Java application has a memory leak. Tools such as the
YourKit Java Profiler can help you identify such leaks.
* Your Java application really needs a lot of memory (more than
the 128 MB default!). In this case the Java heap size can be increased
with the following runtime parameters:
java -Xms<initial heap size> -Xmx<maximum heap size>
Defaults are:
java -Xms32m -Xmx128m
You can set these either in the Java Control Panel or on the command
line, depending on the environment in which you run your application.
---------
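Since your stack trace shows the OutOfMemoryError inside the reduce task,
the -Xmx flag needs to reach the child JVMs that Hadoop forks for map and
reduce tasks, not the JVM that submits the job. A sketch, assuming a
Hadoop version of that era where mapred.child.java.opts controls the child
JVM options (the 512m value is just an illustrative guess, tune it for
your cluster):

```xml
<!-- In hadoop-site.xml (or conf passed to JobConf); assumption: your
     Hadoop version honors mapred.child.java.opts for task JVMs. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx512m</value>
</property>
```

The same option can usually be set per job on the JobConf before
submitting, so you don't have to change the cluster-wide config.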
Hope this helps,
Rasit
2009/2/7 Amandeep Khurana <[email protected]>:
> I'm getting the following error while running my hadoop job:
>
> 09/02/06 15:33:03 INFO mapred.JobClient: Task Id :
> attempt_200902061333_0004_r_000000_1, Status : FAILED
> java.lang.OutOfMemoryError: Java heap space
> at java.util.Arrays.copyOf(Unknown Source)
> at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
> at java.lang.AbstractStringBuilder.append(Unknown Source)
> at java.lang.StringBuffer.append(Unknown Source)
> at TableJoin$Reduce.reduce(TableJoin.java:61)
> at TableJoin$Reduce.reduce(TableJoin.java:1)
> at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:430)
> at org.apache.hadoop.mapred.Child.main(Child.java:155)
>
> Any inputs?
>
> Amandeep
>
>
> Amandeep Khurana
> Computer Science Graduate Student
> University of California, Santa Cruz
>
--
M. Raşit ÖZDAŞ