Look at the tuning guide on Spark's website for strategies to cope with this. I have run into quite a few memory issues like these; some were resolved by changing the StorageLevel strategy and employing things like Kryo serialization, and some by specifying the number of tasks a given operation is broken down into.
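For what it's worth, here is a minimal sketch of the knobs I mean, in Spark's Scala API. The application name, input path, and partition count (200) are all illustrative placeholders, not recommendations; you would size them to your own cluster and data:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object TuningSketch {
  def main(args: Array[String]): Unit = {
    // Kryo is generally faster and more compact than Java serialization.
    val conf = new SparkConf()
      .setAppName("tuning-sketch")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    val sc = new SparkContext(conf)

    // Hypothetical input path -- replace with your own data source.
    val data = sc.textFile("hdfs:///path/to/input")

    // Serialized, disk-spillable storage keeps the heap smaller than the
    // default MEMORY_ONLY, at the cost of some CPU on each access.
    val cached = data.persist(StorageLevel.MEMORY_AND_DISK_SER)

    // More (smaller) tasks reduce per-task memory pressure; 200 is an
    // arbitrary illustrative value.
    val counts = cached
      .repartition(200)
      .map(line => (line, 1L))
      .reduceByKey(_ + _)

    println(counts.count())
    sc.stop()
  }
}
```

This won't run outside a Spark deployment, but it shows where each of the three levers (serializer, storage level, task count) plugs in.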

Ognen

On 3/27/14, 10:21 AM, Sai Prasanna wrote:
"java.lang.OutOfMemoryError: GC overhead limit exceeded"

What is the problem? When I run the same code, one time it finishes in 8 seconds; the next time it takes really long, say 300-500 seconds. In the logs I see "GC overhead limit exceeded" repeatedly. What should be done?

Can someone please throw some light on this?



--
*Sai Prasanna. AN*
*II M.Tech (CS), SSSIHL*
*Entire water in the ocean can never sink a ship, Unless it gets inside.
All the pressures of life can never hurt you, Unless you let them in.*
