Is your process getting killed? If yes, check the kernel log with dmesg to see whether the OOM killer terminated it.
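In case it helps, a quick way to check dmesg for OOM-killer activity (the exact log wording varies across kernel versions, so the pattern below is a loose match, not the definitive string):

```shell
# Search the kernel ring buffer for OOM-killer entries. A hit like
# "Out of memory: Killed process 1234 (java)" means the kernel killed
# the driver JVM, rather than Spark failing on its own.
pattern='kill(ed)? process|out of memory'
dmesg 2>/dev/null | grep -iE "$pattern" || echo "no OOM-killer entries found (or dmesg not readable)"
```

On systemd machines the same log is also available via `journalctl -k`.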
On Mon, Nov 2, 2015 at 8:17 AM, karthik kadiyam <
karthik.kadiyam...@gmail.com> wrote:
Did anyone have issues setting the spark.driver.maxResultSize value?
On Friday, October 30, 2015, karthik kadiyam
wrote:
Hi Shahid,

I played around with the spark driver memory too. In the conf file it was set to "--driver-memory 20G" first. When I changed spark.driver.maxResultSize from the default to 2g, I also changed the driver memory to 30G and tried again. It gave the same error, saying "bigger than spark.driver.maxResultSize".
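For anyone hitting the same wall: the combination described above would look roughly like this on the spark-submit command line (a config sketch only; the class and jar names are hypothetical, and the values are just the ones mentioned in this thread):

```shell
# Illustrative spark-submit invocation. Both settings matter:
# collected results are serialized into the driver's heap, so
# spark.driver.maxResultSize should stay well below --driver-memory.
spark-submit \
  --driver-memory 30g \
  --conf spark.driver.maxResultSize=2g \
  --class com.example.StreamingJob \
  your-streaming-job.jar
```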
Hi,

I guess you need to increase the spark driver memory as well, but that should be set in the conf files. Let me know if that resolves it.
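For reference, the conf-file equivalent would be entries in conf/spark-defaults.conf (the values below are only the ones discussed in this thread, not recommendations):

```
spark.driver.memory          30g
spark.driver.maxResultSize   2g
```

One caveat worth noting: in client mode, spark.driver.memory cannot be raised from inside the application via SparkConf, because the driver JVM has already started by then; it has to come from the conf file or the --driver-memory flag.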
On Oct 30, 2015 7:33 AM, "karthik kadiyam"
wrote:
> Hi,
>
> In the spark streaming job I had the following setting:
>
> this.jsc.getConf().set("spark.driver.maxResultSize", "2g");