In spark-defaults.conf the property name and value are separated by whitespace, so the entry should look like "spark.driver.maxResultSize 0" rather than "spark.driver.maxResultSize=0", I think.

On Sat, Feb 20, 2016 at 3:40 PM, AlexModestov <aleksandrmodes...@gmail.com>
wrote:

> I have the line spark.driver.maxResultSize=0 in spark-defaults.conf,
> but I get an error:
>
> "org.apache.spark.SparkException: Job aborted due to stage failure: Total
> size of serialized results of 18 tasks (1070.5 MB) is bigger than
> spark.driver.maxResultSize (1024.0 MB)"
>
> But if I pass --conf spark.driver.maxResultSize=0 to the pyspark shell, it
> works fine.
>
> Does anyone know how to fix this?
> Thank you
>
