涂小刚 Mon, 07 Nov 2016 23:16:13 -0800
Hi all, I am running a spark-streaming application, but the UI shows that the number of active tasks is larger than the number of cores.
According to my understanding of Spark, one task occupies one core when "spark.task.cpus" is set to 1. Have I misunderstood something?
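
For context, a minimal sketch of the usual task/core accounting being described above; the 4-core figure, the object name, and the local run are hypothetical illustrations, not details from the post:

    // Sketch (hypothetical values): with spark.task.cpus = 1, each task is
    // expected to occupy one CPU slot, so an executor given 4 cores should
    // run at most spark.executor.cores / spark.task.cpus = 4 tasks at once.
    import org.apache.spark.SparkConf

    object TasksVsCoresSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("tasks-vs-cores-sketch")
          .set("spark.executor.cores", "4") // cores per executor (assumed)
          .set("spark.task.cpus", "1")      // CPUs reserved by each task
        val concurrent = conf.get("spark.executor.cores").toInt /
          conf.get("spark.task.cpus").toInt
        println(s"Expected concurrent tasks per executor: $concurrent") // 4
      }
    }

Under that accounting, the streaming UI would not be expected to report more active tasks than total cores, which is why the observed numbers look surprising.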