Jobs consist of tasks, each of which consumes a core (this can be set to more
than one per task, but that's a different story). If more tasks are ready to
execute than there are available cores, the excess tasks simply wait until a
core frees up.
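To make that concrete, here is a toy sketch (plain Python, not actual Spark scheduler code) of the idea: with a fixed pool of cores and one core per task, only as many tasks run at once as there are cores, and the rest queue up.

```python
# Illustrative sketch only -- NOT Spark internals. It models the basic
# rule: tasks each take one core; surplus tasks wait in a FIFO queue.
from collections import deque

def schedule(num_cores, task_ids):
    """Assign tasks to cores; return (running, waiting) lists."""
    queue = deque(task_ids)
    # Fill the available cores first, in submission order.
    running = [queue.popleft() for _ in range(min(num_cores, len(queue)))]
    # Everything else waits until a running task finishes.
    waiting = list(queue)
    return running, waiting

running, waiting = schedule(128, list(range(200)))
# 128 tasks run immediately; the remaining 72 wait for a free core.
```

Nothing fails when you oversubscribe; submissions beyond the core count just sit in the queue and are scheduled as cores become free.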

On Sun, Jul 10, 2022 at 3:31 AM Yong Walt <yongw...@gmail.com> wrote:

> Given that my Spark cluster has 128 cores in total, what will happen if
> the jobs I submit to the cluster (each job assigned only one core) number
> more than 128?
>
> Thank you.
>
