The default value of spark.dynamicAllocation.shuffleTracking.enabled was 
changed from false to true in Spark 3.4.0 [1]; disabling it might help.

[1] 
https://spark.apache.org/docs/latest/core-migration-guide.html#upgrading-from-core-33-to-34
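
For example, you could revert to the pre-3.4 behavior at submit time along 
these lines (a sketch only; the class and jar names are placeholders, and 
note that with shuffle tracking off, dynamic allocation generally needs the 
external shuffle service instead):

```shell
# Disable shuffle tracking (pre-3.4 default) and fall back to the
# external shuffle service for dynamic allocation.
spark-submit \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.shuffleTracking.enabled=false \
  --conf spark.shuffle.service.enabled=true \
  --class com.example.MyApp \
  my-app.jar
```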

Thanks,
Cheng Pan



> On Sep 6, 2024, at 00:36, Jayabindu Singh <jayabi...@gmail.com> wrote:
> 
> Dear Spark Users,
> 
> We have run into an issue: with Spark 3.3.2, auto scaling with STS works 
> fine, but with 3.4.2 or 3.5.2 executors are left behind and do not scale 
> down.
> The driver makes a call to remove the executors, but some (not all) of them 
> are never removed.
> 
> Has anyone else noticed this, or is anyone aware of reported issues?
> 
> Any help will be greatly appreciated.
> 
> Regards
> Jay
> 


---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
