Hi,
It looks like this is not related to Alluxio. Have you tried running the
same job with a different storage backend?
Maybe you could increase the Spark JVM heap size to see if that helps your
issue?
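For example, something along these lines (just a sketch; the right sizes
depend on your cluster, and in client mode the driver heap must be set
before the driver JVM starts, e.g. via spark-submit --driver-memory):

    import org.apache.spark.sql.SparkSession

    // Sketch only: the 8g values are illustrative, not recommendations.
    val spark = SparkSession.builder()
      .appName("alluxio-cache-test")          // hypothetical app name
      .config("spark.executor.memory", "8g")  // heap per executor JVM
      .config("spark.driver.memory", "8g")    // only takes effect if set
                                              // before the driver JVM starts
      .getOrCreate()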
Hope that helps,
Gene
On Wed, Jun 15, 2016 at 8:52 PM, Chanh Le wrote:
Hi everyone,
I added more logs for my use case:
When I cache all my data (500 million records) and run a count, I get this:
16/06/16 10:09:25 ERROR TaskSetManager: Total size of serialized results of 27
tasks (1876.7 MB) is bigger than spark.driver.maxResultSize (1024.0 MB)
>>> that's weird because I just
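For what it's worth, the limit named in the error is configurable; a
minimal sketch, assuming the SparkSession API (the 4g value is
illustrative, and raising it may only mask the real problem):

    import org.apache.spark.sql.SparkSession

    // Sketch: raise the serialized-result cap the error complains about
    // ("0" disables the check entirely). 4g is illustrative only.
    val spark = SparkSession.builder()
      .config("spark.driver.maxResultSize", "4g")
      .getOrCreate()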
Hi Gene,
I am using Alluxio 1.1.0 with the Spark 2.0 preview.
I load from Alluxio, cache the data, and query it a second time; Spark then gets stuck.
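Roughly what I am doing (the path is a placeholder, and I am assuming
Parquet here just for the sketch):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("repro").getOrCreate()

    // Placeholder Alluxio path; the format is assumed for illustration.
    val df = spark.read.parquet("alluxio://alluxio-master:19998/path/to/data")
    df.cache()
    println(df.count())  // 1st query: kicks off the actual caching
    println(df.count())  // 2nd query: this is where Spark gets stuck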
> On Jun 15, 2016, at 8:42 PM, Gene Pang wrote:
Hi,
Which version of Alluxio are you using?
Thanks,
Gene
On Tue, Jun 14, 2016 at 3:45 AM, Chanh Le wrote:
> I am testing Spark 2.0.
> I load data from Alluxio and cache it, then I query. The first query is OK
> because it kicks off the cache action. But after that I run the query again and
> it's stuck