I think you are looking for the spark.task.cpus configuration parameter. If you set it equal to spark.executor.cores, each task reserves every core of its executor, so only one task (e.g. an Arrow-based pandas UDF) runs per executor at a time and can use all of its cores.
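
For example, a minimal sketch (the 8-core executor size comes from your question; the app name is just a placeholder):

    from pyspark.sql import SparkSession

    # Assumed 8-core executors, per the question below; with
    # spark.task.cpus = 8 each task reserves all 8 cores, so a
    # single task gets a whole executor to itself.
    spark = (
        SparkSession.builder
        .appName("multicore-udf")  # placeholder app name
        .config("spark.executor.cores", "8")
        .config("spark.task.cpus", "8")
        .getOrCreate()
    )

Note this trades parallelism for per-task resources: with these settings an 8-core executor runs one task at a time instead of eight.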

On Fri, Jul 29, 2022 at 07:41, Andrew Melo <andrew.m...@gmail.com> wrote:

> Hello,
>
> Is there a way to tell Spark that PySpark (Arrow) functions use
> multiple cores? If we have an executor with 8 cores, we would like a
> single PySpark function to use all 8 cores instead of having 8
> single-core Python functions run.
>
> Thanks!
> Andrew
>
