Hi,

In Spark, there are two settings regarding the number of cores. One is at the
task level: spark.task.cpus

The other controls the number of cores per executor: spark.executor.cores
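
For concreteness, here is a minimal sketch of how the two settings might be
set; the values are illustrative, not recommendations:

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative values: each executor gets 4 cores and each task
    // reserves 2 of them, so at most two tasks run concurrently per executor.
    val conf = new SparkConf()
      .setAppName("task-cpus-example")
      .set("spark.executor.cores", "4")
      .set("spark.task.cpus", "2")
    val sc = new SparkContext(conf)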

Apart from a task that uses more than one core because it calls some
external API, is there any other use case or benefit of assigning more
than one core to a task?

As far as I can see from the code, this setting is only used during
scheduling; RDD partitions themselves are untouched by it. Does this mean
the developer has to write the application logic so that it actually takes
advantage of the extra cores? (Which again makes me question the value of
this setting.) A sketch of what I mean follows.
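
To illustrate the question, here is a hypothetical sketch of what "taking
care of it in the application logic" could look like, assuming
spark.task.cpus = 2. Spark still invokes the task body on a single thread,
so the code itself spawns a thread pool to use the reserved second core;
expensiveWork and the input RDD are placeholders, not anything Spark
provides:

    import java.util.concurrent.{Callable, Executors}
    import org.apache.spark.SparkContext

    // Placeholder for per-record work that is CPU-bound enough to
    // justify an extra core.
    def expensiveWork(s: String): Int = s.hashCode

    def process(sc: SparkContext): Array[Int] = {
      val rdd = sc.parallelize(Seq("a", "b", "c", "d"), numSlices = 2)
      rdd.mapPartitions { records =>
        // Spark runs this body on one thread; the pool below is what
        // actually exploits the second core reserved by spark.task.cpus.
        val pool = Executors.newFixedThreadPool(2) // sized to spark.task.cpus
        val futures = records.map { r =>
          pool.submit(new Callable[Int] {
            override def call(): Int = expensiveWork(r)
          })
        }.toList                 // force submission of all records first
        val results = futures.map(_.get())
        pool.shutdown()
        results.iterator
      }.collect()
    }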

Comments please.

Thanks,

Twinkle
