seeing is somewhat expected (although 25 processes on a single node seem too high)
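If those processes are executor JVMs, one way to bound them explicitly is through the session config. A minimal sketch with assumed values; whether this applies depends on the cluster manager and deployment mode, which are not stated in this thread:

```scala
import org.apache.spark.sql.SparkSession

// Sketch: cap executor resources explicitly instead of relying on defaults
// or dynamic allocation. All values below are assumptions for illustration.
val spark = SparkSession.builder()
  .appName("bounded-executors")
  .config("spark.dynamicAllocation.enabled", "false") // no auto-scaling of executors
  .config("spark.executor.instances", "4")            // assumed executor count
  .config("spark.executor.cores", "2")                // assumed cores per executor
  .getOrCreate()
```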
From: Sachit Murarka
Date: Tuesday, October 13, 2020 at 8:15 AM
To: spark users
Subject: RE: [EXTERNAL] Multiple applications being spawned
Adding Logs.
When it launches the multiple applications, the following logs get generated
on the terminal.
It also always retries the task:
20/10/13 12:04:30 WARN TaskSetManager: Lost task XX in stage XX (TID XX,
executor 5): java.net.SocketException: Broken pipe (Write failed)
at java.net.So
Hi Users,
When an action (I am using count and write) gets executed in my Spark job, it
launches many more application instances (around 25 more apps).
In my Spark code, I am running the transformations through DataFrames, then
converting the DataFrame to an RDD, then applying zipWithIndex, then
converting
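For reference, a minimal sketch of the DataFrame → RDD → zipWithIndex pipeline described above. The output column name, the write path, and the conversion back to a DataFrame are assumptions for illustration, not details taken from this thread:

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types.{LongType, StructField, StructType}

object ZipWithIndexSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("zipWithIndex-sketch")
      .getOrCreate()

    // Stand-in for the DataFrame produced by the earlier transformations.
    val df = spark.range(0, 1000).toDF("value")

    // Convert the DataFrame to an RDD and attach a stable 0-based index.
    val indexedRdd = df.rdd.zipWithIndex().map {
      case (row, idx) => Row.fromSeq(row.toSeq :+ idx)
    }

    // Rebuild a DataFrame with an extra index column ("row_index" is an assumed name).
    val schema = StructType(df.schema.fields :+
      StructField("row_index", LongType, nullable = false))
    val indexedDf = spark.createDataFrame(indexedRdd, schema)

    // Actions such as count() or write trigger the whole lineage above.
    println(indexedDf.count())
    indexedDf.write.mode("overwrite").parquet("/tmp/indexed_output") // hypothetical path

    spark.stop()
  }
}
```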