Hi,

I was running a WordCount application on Spark, and the machine I used has
4 physical cores. However, in the spark-env.sh file, I set
SPARK_WORKER_CORES=32. The web UI says it launched one executor with 32
cores, and the executor could execute 32 tasks simultaneously. Does Spark
create 32 vCores out of 4 physical cores? How much physical CPU resource
can each task get then?
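For reference, this is the setting I mean (a sketch of my setup; the file lives under conf/ in my Spark install):

```shell
# conf/spark-env.sh -- worker configuration
# Advertise 32 cores to the master, even though the machine
# only has 4 physical cores. Note: shell assignments must not
# have spaces around "=".
export SPARK_WORKER_CORES=32
```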

Also, I found the parameter “spark.task.cpus”, but I don’t quite understand
it. If I set it to 2, does Spark allocate 2 CPU cores to one task? I think
a “task” is a “thread” within an executor (a “process”), so how can a
single thread utilize two CPU cores simultaneously?
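To be concrete, this is how I would set it (a sketch; the application class and jar names below are just placeholders for my WordCount job):

```shell
# Option 1: in conf/spark-defaults.conf
#   spark.task.cpus  2

# Option 2: per job, on the command line at submission time
# (class name, jar, and input path are placeholders)
spark-submit --conf spark.task.cpus=2 \
  --class WordCount wordcount.jar input.txt
```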

I am looking forward to your reply, thanks!

Best,
Rui
