I specified spark.cores.max = 4, but it started 2 executors with 2 cores
each, one on each of the 2 workers.
In standalone cluster mode, though we can specify the cores a Worker
offers, there is no way to specify the number of cores an executor must
take on that particular worker machine.
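For what it's worth, the 2+2 split looks like the standalone master's
default spread-out scheduling. A sketch of the settings involved (property
names as in the standalone docs; note spark.deploy.spreadOut is a
master-side setting, not per application):

  # spark-defaults.conf (application side, sketch)
  spark.cores.max  4            # total cores across the whole application

  # master side (default)
  spark.deploy.spreadOut  true  # spread an app's cores across workers,
                                # hence 2 cores on each of the 2 workers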
On Sat, Oct 24, 2015 at 1:41
How did you specify the number of cores each executor can use?
Be sure to use this when submitting jobs with spark-submit:
--total-executor-cores 100
Other options won't work, in my experience.
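For example, something along these lines (master URL, class name, and jar
are placeholders, and 4 is just an example cap):

  ./bin/spark-submit \
    --master spark://<master-host>:7077 \
    --total-executor-cores 4 \
    --class com.example.YourApp \
    your-app.jar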
On Fri, Oct 23, 2015 at 8:36 AM, gaurav sharma wrote:
Hi,
I created 2 workers on the same machine, each with 4 cores and 6GB RAM.
I submitted the first job, and it allocated 2 cores on each of the worker
processes and utilized the full 4 GB RAM for each executor process.
When I submit my second job, it always stays in the WAITING state.
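In case it helps, this is roughly how the two workers are set up (a sketch
of my spark-env.sh; variable names as in the standalone docs):

  # conf/spark-env.sh (sketch)
  SPARK_WORKER_INSTANCES=2   # two worker processes on this machine
  SPARK_WORKER_CORES=4       # cores each worker offers
  SPARK_WORKER_MEMORY=6g     # memory each worker offers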
Cheers!!
On Tue, Oct 20, 2015
You can set the max cores for the first submitted job such that it does not
take all the resources from the master. See
http://spark.apache.org/docs/latest/submitting-applications.html
# Run on a Spark standalone cluster in client deploy mode
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://<master-host>:7077 \
  --total-executor-cores 100 \
  /path/to/examples.jar
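Equivalently (a sketch, with class name and jar as placeholders), the cap
can be passed as a config property at submit time:

  ./bin/spark-submit \
    --master spark://<master-host>:7077 \
    --conf spark.cores.max=2 \
    --class com.example.YourApp \
    your-app.jar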
Hi All,
Would it be possible to run multiple spark streaming jobs on a single
master at the same time?
I currently have one master node and several worker nodes in standalone
mode, and I used spark-submit to submit multiple spark streaming jobs.
From what I observed, it seems like only the first submitted job runs,
while the later ones stay in the WAITING state.