I don’t think it’s possible, as Spark does not support thread-to-CPU affinity.
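What you can control is how many tasks run concurrently in local mode via the master string `local[N]`; pinning those threads to a particular core is left to the OS scheduler. A minimal sketch (assuming the standard Scala API, class and app names are made up for illustration):

    import org.apache.spark.{SparkConf, SparkContext}

    object LocalConcurrencyExample {
      def main(args: Array[String]): Unit = {
        // local[5] gives 5 worker threads, so all 5 tasks of a 5-partition RDD
        // can run at the same time; local[1] would run them one after another.
        // Spark does NOT pin these threads to specific CPU cores.
        val conf = new SparkConf()
          .setAppName("LocalConcurrencyExample")
          .setMaster("local[5]")
        val sc = new SparkContext(conf)

        val rdd = sc.parallelize(1 to 100, numSlices = 5) // 5 partitions ~ 5 tasks
        println(rdd.map(_ * 2).reduce(_ + _))

        sc.stop()
      }
    }

If you really need the whole JVM confined to one physical core, that has to be done outside Spark with an OS-level tool (e.g. `taskset -c 0 spark-submit ...` on Linux), not through a Spark setting.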
> On Aug 4, 2016, at 14:27, sujeet jog <sujeet....@gmail.com> wrote:
> 
> Is there a way we can run multiple tasks concurrently on a single core in 
> local mode?
> 
> For example: I have 5 partitions ~ 5 tasks and only a single core; I want these 
> tasks to run concurrently, and to specify that they run on a single core. 
> 
> The machine itself has, say, 4 cores, but I want to utilize only 1 core out of 
> it. 
> 
> Is it possible ?
> 
> Thanks, 
> Sujeet
> 


