Spark distributes load across executors, and each executor is usually
pre-configured with a fixed number of cores. You may want to check with
your Spark admin how many executors (or slaves) your Spark cluster is
configured with and how many cores each executor is pre-configured to use.
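If you can attach a spark-shell (or submit a small job) to the same cluster, a minimal sketch along these lines can confirm what was actually allocated. It assumes the standard spark.executor.instances / spark.executor.cores configuration keys and the SparkStatusTracker API; it is an illustration, not a definitive diagnostic.

    import org.apache.spark.sql.SparkSession

    // Sketch: inspect how many executors and cores this application actually received.
    val spark = SparkSession.builder().appName("executor-check").getOrCreate()

    // Requested settings; these may report "not set" if the cluster relies on
    // dynamic allocation or on defaults configured by the admin.
    println("spark.executor.instances = " + spark.conf.get("spark.executor.instances", "not set"))
    println("spark.executor.cores     = " + spark.conf.get("spark.executor.cores", "not set"))

    // Executors currently registered with the driver (the list includes the driver itself).
    val executors = spark.sparkContext.statusTracker.getExecutorInfos
    println(s"registered executors (incl. driver): ${executors.length}")

Note that with dynamic allocation enabled, the number of registered executors can differ from the values passed at submit time, so it is worth checking both.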
We're using Spark 2.4. We recently pushed to production a product that uses
Spark Structured Streaming. It works well most of the time, but
occasionally, when the load is high, we've noticed that there are only 10
or so 'Active Tasks' even though we've provided 128 cores. We'd like to
debug this.