Hello!
Thank you very much for your response. In the book "Learning Spark" I
found out the following sentence:
"Each application will have at most one executor on each worker"
So a worker can have one executor process spawned for a given application, or none
(perhaps depending on how the workload is distributed).
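As a side note on the "at most one executor per worker" rule: in standalone mode you can effectively get more executors per machine by running several worker daemons on it. A minimal sketch of a `conf/spark-env.sh` fragment, with illustrative values:

```shell
# conf/spark-env.sh -- standalone mode only
# Run two worker daemons on this machine, so a single application
# can get up to two executors here (one per worker).
export SPARK_WORKER_INSTANCES=2
# Split the machine's resources between the two workers.
export SPARK_WORKER_CORES=4
export SPARK_WORKER_MEMORY=8g
```

Each worker then registers with the master separately and can host its own executor for the same application.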
Best,
Hi Spico,
Yes, I think an "executor core" in Spark is basically a thread in a worker
pool. For best performance it's recommended to have one executor core per
physical core on your machine, but in theory you can create as many threads
as your OS allows.
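To make that concrete, here is a sketch of how the executor-core count is typically set at submission time. The `--executor-cores` flag and the `spark.cores.max` property are standard Spark options; the master URL, class name, and jar are placeholders:

```shell
# Give each executor 4 threads ("cores"), and cap the application at
# 8 cores total -- so on a cluster of 4-core workers it would get
# 2 executors of 4 threads each.
spark-submit \
  --master spark://master:7077 \
  --executor-cores 4 \
  --conf spark.cores.max=8 \
  --class com.example.MyApp \
  myapp.jar
```

Tasks are then scheduled onto those threads, which is why the core count directly controls per-executor task parallelism.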
For deployment:
There seems to be t