Hello,

Is there a way to tell Spark that a PySpark (Arrow) function uses multiple cores? If an executor has 8 cores, we would like a single PySpark function invocation to use all 8 cores, instead of having 8 single-core Python functions run in parallel.
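For context, here is a minimal sketch of the kind of job we have in mind. Setting spark.task.cpus equal to spark.executor.cores is our best guess at how to reserve all executor cores for one task; the heavy_fn UDF and the multi-threaded-library workload inside it are hypothetical placeholders:

    # Sketch only; spark.task.cpus=8 is our guess at the relevant setting.
    import pandas as pd
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import pandas_udf

    spark = (
        SparkSession.builder
        .config("spark.executor.cores", "8")
        # Assumption: this makes Spark schedule one task per executor,
        # so a single Python worker has all 8 cores to itself.
        .config("spark.task.cpus", "8")
        .config("spark.sql.execution.arrow.pyspark.enabled", "true")
        .getOrCreate()
    )

    @pandas_udf("double")
    def heavy_fn(v: pd.Series) -> pd.Series:
        # Placeholder for a function that calls a multi-threaded library
        # (e.g. NumPy backed by multi-threaded BLAS) able to use the
        # 8 cores reserved above.
        return v * 2.0

    df = spark.range(1_000_000).withColumn("out", heavy_fn("id"))
    df.write.mode("overwrite").parquet("/tmp/out")

Is this the intended approach, or is there a better mechanism for multi-core Python UDFs?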
Thanks!
Andrew