How are you setting this memory? You may be configuring the wrong process's
memory, e.g. the driver's heap rather than the executors'.
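
For example, here is a minimal sketch against the Spark 1.x Scala API (the
app name, input path, and "16g" figure below are placeholders, not values
from your setup). Executor memory and the serializer go into SparkConf
before the context is created, while the driver's heap has to be set on the
command line (spark-submit --driver-memory), since the driver JVM is already
running by the time your SparkConf is read:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.storage.StorageLevel

    val conf = new SparkConf()
      .setAppName("gc-overhead-demo")  // placeholder app name
      // Heap for each *executor* JVM -- this, not the launching JVM's
      // -Xmx, is what matters when tasks run out of memory on workers.
      .set("spark.executor.memory", "16g")
      // Kryo serialization, as you are already using.
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

    val sc = new SparkContext(conf)

    // With MEMORY_AND_DISK_SER, serialized partitions that don't fit
    // in memory spill to disk instead of exhausting the heap.
    val data = sc.textFile("hdfs:///path/to/input")  // placeholder path
                 .persist(StorageLevel.MEMORY_AND_DISK_SER)

Raising -Xmx in the shell that runs spark-submit only grows the driver's
heap; the executors keep whatever spark.executor.memory says.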
On Oct 1, 2014 9:37 PM, "anny9699" <anny9...@gmail.com> wrote:

> Hi,
>
> After reading some previous posts about this issue, I have increased the
> Java heap space to "-Xms64g -Xmx64g", but I still hit the
> "java.lang.OutOfMemoryError: GC overhead limit exceeded" error. Does anyone
> have other suggestions?
>
> I am reading 200 GB of data and my total memory is 120 GB, so I use the
> "MEMORY_AND_DISK_SER" storage level and Kryo serialization.
>
> Thanks a lot!
>
