Hi Zuhair,
In my experience, you could try the following steps to avoid Spark
OOM:
1. Increase JVM memory by adding export SPARK_JAVA_OPTS="-Xmx2g" (e.g. in conf/spark-env.sh)
2. Use .persist(storage.StorageLevel.MEMORY_AND_DISK) instead of .cache(), so partitions that don't fit in memory spill to disk instead of triggering OOM
3. Have you set the spark.executor.memory value? It's 512m by default (see the sketch after this list).
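
For example, here is a minimal sketch of steps 2 and 3, assuming a Scala job on Spark 0.9; the master URL, app name, and input path below are placeholders, while spark.executor.memory and StorageLevel.MEMORY_AND_DISK are standard Spark settings:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object OomTuningSketch {
  def main(args: Array[String]) {
    // Step 3: raise executor memory above the 512m default.
    val conf = new SparkConf()
      .setMaster("spark://master:7077")   // placeholder master URL
      .setAppName("OomTuningSketch")      // placeholder app name
      .set("spark.executor.memory", "2g")
    val sc = new SparkContext(conf)

    // Step 2: MEMORY_AND_DISK spills partitions that don't fit in RAM
    // to disk; .cache() is shorthand for MEMORY_ONLY, which instead
    // drops (and later recomputes) partitions that don't fit.
    val data = sc.textFile("hdfs:///path/to/input")  // placeholder path
      .persist(StorageLevel.MEMORY_AND_DISK)

    println("records: " + data.count())
    sc.stop()
  }
}

The trade-off is that with MEMORY_AND_DISK a job that outgrows the heap slows down instead of dying, which is usually what you want for very large inputs.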
Dear all,
I am getting an OutOfMemoryError in class ByteString.java from package
com.google.protobuf when processing very large data with Spark 0.9. Does
increasing spark.shuffle.memoryFraction help, or should I add more memory
to my workers? Below is the error I get during execution.
14/05/25 07:26