On Fri, Oct 28, 2022 at 12:10 AM Shrikant Prasad wrote:
>
>> Thanks, Dongjoon, for replying. I have tried with Spark 3.2 and am
>> still facing the same issue.
>>
>> I am looking for pointers that can help me debug and find the root
>> cause.
>>
>> Regards,
> Could you try a newer release, like Apache Spark 3.3.1, to see whether
> you are still experiencing the same issue?
>
> It will narrow the scope of your issue by excluding the many known bugs
> already fixed in 3.0/3.1/3.2/3.3.0.
>
> Thanks,
> Dongjoon.
>
>
> On Wed, Oct 26, 2022 at 11:16 PM Shrikant Prasad wrote:
[…] Also, why will the SparkContext be shut down in this case instead of
retrying with new executors?
Another doubt: why does the driver pod get deleted? Shouldn't it just error
out?
Regards,
Shrikant
--
Regards,
Shrikant Prasad
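[Editor's note, not part of the thread: a common first step for the executor-loss questions above is to keep terminated executor pods around so their logs and exit codes can still be inspected with kubectl. The two property names below are real Spark settings, but the values and the framing are a sketch; verify defaults against your Spark version's documentation.]

```
# spark-defaults.conf fragment (sketch, not from the thread)

# Keep terminated executor pods instead of deleting them, so
# `kubectl logs` / `kubectl describe pod` still work after a failure.
spark.kubernetes.executor.deleteOnTermination  false

# Number of failures of any particular task before giving up on the job;
# exhausting this budget is one way the SparkContext ends up shut down.
spark.task.maxFailures  4
```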
> […] We cannot introduce one new "default" config for every array-like
> config.
>
> I wanted to know if others have experienced this issue and what systems
> have been implemented to tackle it. Are there any existing solutions,
> either client-side or server-side (e.g. at a job submission server)?
> Even though we cannot easily enforce this at the client side, the
> simplicity of a solution may make it more appealing.
>
> Thanks,
> Shardul
>
> --
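[Editor's note, not part of the thread: the server-side option Shardul mentions could be sketched as a submission-time hook that merges administrator-mandated entries into comma-separated, array-like configs. Nothing here is an existing Spark API — `merge_array_conf`, `ARRAY_LIKE_DEFAULTS`, and the example entries are hypothetical; `spark.jars` and `spark.files` are real comma-separated Spark configs.]

```python
# Hypothetical submission-server hook -- NOT an existing Spark or Livy API.
# Merges administrator-mandated entries into comma-separated "array-like"
# Spark configs, avoiding a separate "default" companion config per key.

# Site-wide mandated entries (made-up example values).
ARRAY_LIKE_DEFAULTS = {
    "spark.jars": ["hdfs:///libs/audit.jar"],
    "spark.files": ["/etc/spark/site-defaults.prop"],
}

def merge_array_conf(user_conf, defaults=None, sep=","):
    """Return a copy of user_conf with mandated entries prepended,
    deduplicated, and user-supplied entries preserved in order."""
    if defaults is None:
        defaults = ARRAY_LIKE_DEFAULTS
    merged = dict(user_conf)
    for key, mandated in defaults.items():
        user_items = [v for v in merged.get(key, "").split(sep) if v]
        seen, combined = set(), []
        for item in mandated + user_items:
            if item not in seen:
                seen.add(item)
                combined.append(item)
        if combined:
            merged[key] = sep.join(combined)
    return merged
```

Prepending mandated entries and deduplicating makes the hook idempotent on resubmission, and user-supplied entries are never dropped, only appended after the mandated ones.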
Regards,
Shrikant Prasad