…not sure when it will be reviewed…

but for now you can work around it by running multiple worker instances on a 
single machine  

http://spark.apache.org/docs/latest/spark-standalone.html

search for SPARK_WORKER_INSTANCES
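For example, in conf/spark-env.sh on each worker machine (just a sketch; the
values 2 and 3 below are assumptions, adjust them to your hardware) you could
set:

    export SPARK_WORKER_INSTANCES=2
    export SPARK_WORKER_CORES=3

and then restart the workers, so a job asking for spark.cores.max=6 can get
its cores through two executors per machine instead of one.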

Best,  

--  
Nan Zhu
http://codingcat.me


On Wednesday, January 21, 2015 at 6:50 PM, Larry Liu wrote:

> Will SPARK-1706 be included in the next release?
>  
> On Wed, Jan 21, 2015 at 2:50 PM, Ted Yu <yuzhih...@gmail.com> wrote:
> > Please see SPARK-1706
> >  
> > On Wed, Jan 21, 2015 at 2:43 PM, Larry Liu <larryli...@gmail.com> wrote:
> > > I tried to submit a job with  --conf "spark.cores.max=6"  or 
> > > --total-executor-cores 6 on a standalone cluster. But I don't see more 
> > > than 1 executor on each worker. I am wondering how to use multiple 
> > > executors when submitting jobs.
> > >  
> > > Thanks
> > > larry
