From: Bahubali Jain
Sent: Thursday, March 16, 2017 11:41 PM
To: Yong Zhang
Cc: user@spark.apache.org
Subject: Re: Dataset : Issue with Save
I am using Spark 2.0. There are comments in the ticket since Oct-2016 which
clearly mention that the issue still persists even in 2.0.
From: Bahubali Jain
Sent: Thursday, March 16, 2017 10:34 PM
To: Yong Zhang
Cc: user@spark.apache.org
Subject: Re: Dataset : Issue with Save
Hi,
Was this not yet resolved?
It is a very common requirement to save a dataframe. Is there a better way to
save a dataframe while avoiding the data being sent to the driver?
"Total size of serialized results of 3722 tasks (1024.0 MB) is bigger than
spark.driver.maxResultSize (1024.0 MB)"
From: Yong Zhang
Subject: Re: Dataset : Issue with Save
[...] memory space for the driver even when there are no requests to collect
data back to the driver.
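A minimal sketch of the configuration knob named in the error, assuming the app
name and the "4g" value as arbitrary examples: raising
spark.driver.maxResultSize lifts the cap on the total serialized task results
the driver will accept. It does not remove the driver-side memory cost
described above; it only moves the failure threshold.

import org.apache.spark.sql.SparkSession;

public class RaiseMaxResultSize {
    public static void main(String[] args) {
        // spark.driver.maxResultSize defaults to 1g; setting it to "0"
        // disables the check entirely, at the risk of the driver running
        // out of memory instead.
        SparkSession spark = SparkSession.builder()
                .appName("RaiseMaxResultSize")
                .config("spark.driver.maxResultSize", "4g")
                .getOrCreate();

        spark.stop();
    }
}

The same setting can also be passed at launch time with
spark-submit --conf spark.driver.maxResultSize=4g.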
From: Bahubali Jain
Sent: Thursday, March 16, 2017 1:39 PM
To: user@spark.apache.org
Subject: Dataset : Issue with Save
Hi,
While saving a dataset using *mydataset.write().csv("outputlocation")* I am
running into the exception:
*"Total size of serialized results of 3722 tasks (1024.0 MB) is bigger than
spark.driver.maxResultSize (1024.0 MB)"*
Does it mean that for saving a dataset whole of [...]