Congratulations Ishizaki-san.
Thanks,
Madhu.
-----Denny Lee wrote: -----
To: Dongjin Lee
From: Denny Lee
Date: 10/03/2018 06:31PM
Cc: dev
Subject: Re: welcome a new batch of committers
Try increasing the Spark worker memory in conf/spark-env.sh:
export SPARK_WORKER_MEMORY=2g
Thanks,
Madhu.
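For reference, here is a minimal Scala sketch of the related per-application setting, assuming a standalone cluster; the app name and sizes below are placeholders, not values from this thread. SPARK_WORKER_MEMORY caps the total memory a standalone worker can offer to executors on its node, while spark.executor.memory requests the per-executor heap and must fit within that cap.

import org.apache.spark.{SparkConf, SparkContext}

// Illustrative sketch: request executor memory from the application side.
// The per-executor value must fit within SPARK_WORKER_MEMORY on each worker.
val conf = new SparkConf()
  .setAppName("memory-config-example")          // placeholder app name
  .set("spark.executor.memory", "1g")           // per-executor heap request
val sc = new SparkContext(conf)

The same property can also be passed on the command line, e.g. via --executor-memory with spark-submit, instead of being set in code.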
Ratika Prasad
Should this be done on the master node, the slave nodes, or both?
From: Madhusudanan Kandasamy [mailto:madhusuda...@in.ibm.com]
Sent: Wednesday, August 19, 2015 9:31 PM
To: Ratika Prasad
Hi,
I'm new to Spark and am trying to understand the DAGScheduler code flow. As far as I
understand, getMissingParentStages() appears to do a redundant job of
re-calculating stage dependencies. When the first stage is created,
all of its dependent/parent stages would be recursively calculated and