Hi all,
When I run Spark applications, I see in the web UI that some stage
descriptions read like "apply at Option.scala:120".
Why does Spark label a stage with a line that is not in my Spark program
but in a Scala library?
Thanks
Jensen
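
(As far as I can tell, stages are actually split at shuffle boundaries; the
description is just the call site Spark records when the action runs, and if
the job is triggered through Scala-library code such as Option.apply, that
frame is what gets picked up. When the auto-detected label is unhelpful, you
can set one yourself. A minimal sketch, assuming a local master; setCallSite
and clearCallSite are existing SparkContext methods:

  import org.apache.spark.{SparkConf, SparkContext}

  object CallSiteDemo {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("callsite-demo").setMaster("local[*]")
      val sc   = new SparkContext(conf)
      val rdd  = sc.parallelize(1 to 1000)

      // Label the next job explicitly: the web UI shows "my-count-step"
      // instead of an auto-detected site like "apply at Option.scala:120".
      sc.setCallSite("my-count-step")
      rdd.count()
      sc.clearCallSite()   // back to automatic call-site detection

      sc.stop()
    }
  }
)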
On Mon, Jul 28, 2014 at 8:28 PM, Wang, Jensen <jensen.w...@sap.com> wrote:
Hi all,
Before sc.runJob invokes dagScheduler.runJob, the func performed
on the RDD is "cleaned" by ClosureCleaner.clean.
Why does Spark have to do this? What's the purpose?
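
(My understanding, hedged: a Scala closure is compiled into a class whose
instances can capture the enclosing object even when they only use one of
its fields. ClosureCleaner.clean nulls out captured outer references the
function never actually uses, then checks that the result is serializable,
so task closures stay small and a non-serializable capture fails fast on
the driver instead of on an executor. A minimal sketch of the failure mode
it guards against; JobRunner is a made-up class:

  import org.apache.spark.rdd.RDD

  // Made-up enclosing class; note it is NOT Serializable.
  class JobRunner(connection: java.sql.Connection, multiplier: Int) {

    def scaleAndCount(rdd: RDD[Int]): Long = {
      // BAD: `_ * multiplier` reads a field, so the closure captures `this`
      // and would drag the whole non-serializable JobRunner into each task.
      // The cleaner's serializability check rejects this on the driver:
      //   rdd.map(_ * multiplier).count()

      // GOOD: copy the field into a local val; the closure now captures
      // only an Int and serializes cleanly.
      val m = multiplier
      rdd.map(_ * m).count()
    }
  }
)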
On Tuesday, July 22, 2014 at 10:18 AM, Wang, Jensen wrote:
Hi,
I started to use Spark on YARN recently and found a problem while
tuning my program.
When the SparkContext is initialized as sc and ready to read a text file
from HDFS, the textFile(path, defaultMinPartitions) method is called.
I traced down the second parameter in the Spark source code and ...
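
(For reference, in the 1.x source SparkContext defines

  def defaultMinPartitions: Int = math.min(defaultParallelism, 2)

so the default is capped at 2; the real partition count is usually higher
anyway, because the Hadoop input format splits the file by HDFS block. To
force more read parallelism, pass the minimum explicitly. A small sketch,
reusing the sc from above; the path is made up:

  val path = "hdfs:///user/jensen/input.txt"      // made-up path
  val rdd  = sc.textFile(path, minPartitions = 8) // ask for at least 8 splits
  println(s"partitions = ${rdd.partitions.length}")
)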