This is a case where resources are fixed within the same SparkContext, but the
SQL queries have different priorities.
Some SQL queries are only allowed to execute when there are spare resources;
once a high-priority SQL query comes in, the tasksets of those low-priority
queries are either killed or stalled.
If we set a high priority pool's mi
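
For reference, here is a minimal sketch of how fair-scheduler pools can be set up
inside one SparkContext (the pool names, weight/minShare values, and the file path
below are only assumptions for illustration, not settings from this thread):

// fairscheduler.xml (illustrative path and values):
//   <allocations>
//     <pool name="high"><schedulingMode>FAIR</schedulingMode><weight>4</weight><minShare>8</minShare></pool>
//     <pool name="low"><schedulingMode>FIFO</schedulingMode><weight>1</weight><minShare>0</minShare></pool>
//   </allocations>

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("priority-pools-sketch")
  .config("spark.scheduler.mode", "FAIR")
  .config("spark.scheduler.allocation.file", "/path/to/fairscheduler.xml") // assumed path
  .getOrCreate()

// Each concurrent query thread selects its pool via a thread-local property.
spark.sparkContext.setLocalProperty("spark.scheduler.pool", "low")
spark.range(0L, 1000000000L).selectExpr("sum(id)").show()

// A high-priority query submitted from another thread would use the "high" pool;
// its weight and minShare determine how many cores it can claim back while
// low-priority tasksets are still running.
spark.sparkContext.setLocalProperty("spark.scheduler.pool", "high")
spark.range(0L, 1000000000L).selectExpr("sum(id)").show()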
From: Qian SUN
Sent: Wednesday, May 18, 2022 9:32
To: Bowen Song
Cc: user.spark
Subject: Re: A scene with unstable Spark performance
Hi. I think you need Spark dynamic resource allocation. Please refer to
https://spark.apache.org/docs/latest/job-scheduling.html#dynamic
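
For context, a minimal sketch of what enabling dynamic resource allocation could
look like (the executor counts and timeout below are illustrative, not
recommendations):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("dynamic-allocation-sketch")
  .config("spark.dynamicAllocation.enabled", "true")
  // shuffle tracking avoids requiring the external shuffle service (Spark 3.x)
  .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
  .config("spark.dynamicAllocation.minExecutors", "2")   // illustrative
  .config("spark.dynamicAllocation.maxExecutors", "50")  // illustrative
  .config("spark.dynamicAllocation.executorIdleTimeout", "60s")
  .getOrCreate()

// Executors are released when idle and requested again as pending tasks build up,
// which smooths resource usage across queries of different sizes.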