On Mon, Mar 10, 2014 at 8:56 AM, Sen, Ranjan [USA] wrote:
> Hi Sourav
> That makes so much sense. Thanks much.
> Ranjan
>
> From: Sourav Chandra
> Reply-To: "user@spark.apache.org"
> Date: Sunday, March 9, 2014 at 10:37 PM
> To: "user@spark.apache.org"
> Subject: Re: [External] Re: no stdout output from worker
Hi Ranjan,
Whatever code is passed as a closure to Spark operations like map, flatMap,
filter, etc. runs as part of a task on the workers.
Everything else runs in the driver.
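For example (an illustrative sketch only, not code from this thread - it
assumes a local master and a hypothetical input file "input.txt"):

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class DriverVsTask {
    public static void main(String[] args) {
        // Runs in the driver: sets up the context and the RDD lineage.
        JavaSparkContext sc = new JavaSparkContext("local[2]", "driver-vs-task");
        JavaRDD<String> lines = sc.textFile("input.txt");

        // The Function passed to map() is serialized and executed inside
        // tasks on the workers.
        JavaRDD<Integer> lengths = lines.map(new Function<String, Integer>() {
            public Integer call(String s) {
                // task side: this println goes to the worker/executor
                // stdout log, not to the driver console
                System.out.println("task side: " + s);
                return s.length();
            }
        });

        // count() is an action; its result comes back to the driver, so
        // this println shows up on the driver console.
        System.out.println("driver side: " + lengths.count() + " lines");

        sc.stop();
    }
}

So any println inside such a closure ends up in the executor's stdout log on
the worker, which is why it does not appear on the driver console.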
Thanks,
Sourav
On Mon, Mar 10, 2014 at 12:03 PM, Sen, Ranjan [USA] wrote:
> Hi Patrick
>
> How do I know which part of the code is in the driver and which in task?
Just a correction - the strange symbol in my mail below was auto-generated;
it stands for ". . . ."
Thanks again,
Ranjan
On 3/9/14, 11:33 PM, "Sen, Ranjan [USA]" wrote:
>Hi Patrick
>
>How do I know which part of the code is in the driver and which in task?
>The structure of my code is as below-
>
>Š
Hi Patrick
How do I know which part of the code is in the driver and which in task?
The structure of my code is as below-
Š
static boolean done = false;
Š
public static void main(..
    ..
    JavaRDD lines = ..
    ..
    while (!done) {
        ..
        while (..) {
            JavaPairRDD<Š> labs1 = labs.map(new PairFunction<Š>() { Š });
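(For reference, a runnable sketch of that structure, annotated with where each
part executes. Everything here - the labs.txt input, the String/Integer
key-value types, the three-iteration cutoff - is hypothetical and not taken
from the actual program; it uses mapToPair from the current Java API in place
of the map(new PairFunction...) call above.)

import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.PairFunction;

import scala.Tuple2;

public class LoopSketch {
    public static void main(String[] args) {
        // driver side: context setup and RDD lineage
        JavaSparkContext sc = new JavaSparkContext("local[2]", "loop-sketch");
        JavaRDD<String> labs = sc.textFile("labs.txt");   // hypothetical input

        boolean done = false;
        int iterations = 0;
        while (!done) {              // driver side: the loop and the done flag
            JavaPairRDD<String, Integer> labs1 =
                labs.mapToPair(new PairFunction<String, String, Integer>() {
                    public Tuple2<String, Integer> call(String s) {
                        // task side: this body is shipped to and run on the workers
                        return new Tuple2<String, Integer>(s, s.length());
                    }
                });

            // driver side: count() is an action, so its result comes back
            // here and the loop condition is evaluated in the driver.
            long pairCount = labs1.count();
            iterations++;
            done = (pairCount == 0) || (iterations >= 3);
        }

        sc.stop();
    }
}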