Hi
I guess you need to increase the Spark driver memory as well, but that should
be set in the conf files rather than on the running job.
Let me know if that resolves the issue.
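
For example (a minimal sketch; the 2g value is the one from your mail, and the
driver-memory figure and app name below are just placeholders), the limit can be
set in conf/spark-defaults.conf or passed to spark-submit so that it is applied
before the driver starts:

    # conf/spark-defaults.conf
    spark.driver.maxResultSize   2g
    spark.driver.memory          4g

    # or equivalently on the command line
    spark-submit --conf spark.driver.maxResultSize=2g --conf spark.driver.memory=4g ...

If you prefer to keep it in code, it needs to go on the SparkConf before the
streaming context is created; changing the conf of an already-created context is
not picked up, which would explain why the job still reports the 1024 MB default:

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    // Set the limit before the context exists, not via jsc.getConf().set(...) afterwards.
    SparkConf conf = new SparkConf()
            .setAppName("my-streaming-job")                      // placeholder app name
            .set("spark.driver.maxResultSize", "2g");
    JavaStreamingContext jsc =
            new JavaStreamingContext(conf, Durations.seconds(10)); // example batch interval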
On Oct 30, 2015 7:33 AM, "karthik kadiyam" <karthik.kadiyam...@gmail.com>
wrote:

> Hi,
>
> In a Spark Streaming job I had the following setting:
>
>             this.jsc.getConf().set("spark.driver.maxResultSize", "0");
> and I got the following error in the job:
>
> User class threw exception: Job aborted due to stage failure: Total size
> of serialized results of 120 tasks (1082.2 MB) is bigger than
> spark.driver.maxResultSize (1024.0 MB)
>
> I realized that the default value is 1 GB, so I changed
> the configuration as below:
>
> this.jsc.getConf().set("spark.driver.maxResultSize", "2g");
>
> and when I ran the job it gave the same error:
>
> User class threw exception: Job aborted due to stage failure: Total size
> of serialized results of 120 tasks (1082.2 MB) is bigger than
> spark.driver.maxResultSize (1024.0 MB)
>
> So the change I made has not been picked up by the job. My questions
> are:
>
> - Is "spark.driver.maxResultSize", "2g" the right way to change this, or
> is there another way to do it?
> - Is this a bug in Spark 1.3, or has anyone had this issue before?
>
>
