Thanks,
Since I'm running in local mode, I plan to pin the JVM to a CPU with
taskset -cp; hopefully with this, all the tasks will operate
on the specified CPU cores.
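A minimal sketch of that pinning step, assuming a Linux box with util-linux's taskset; the PID below is hypothetical (you'd find the real driver JVM's PID with jps or ps):

```shell
# Pin the JVM process and all of its existing threads (-a) to core 0.
# 12345 is a placeholder for the Spark driver's PID.
taskset -acp 0 12345

# Verify the new affinity list for the process.
taskset -cp 12345
```

Note that -a matters here: without it, only the main thread is re-pinned, while Spark's task threads keep their old affinity.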
Thanks,
Sujeet
On Thu, Aug 4, 2016 at 8:11 PM, Daniel Darabos <daniel.dara...@lynxanalytics.com> wrote:
You could run the application in a Docker container constrained to one CPU
with --cpuset-cpus (
https://docs.docker.com/engine/reference/run/#/cpuset-constraint).
On Thu, Aug 4, 2016 at 8:51 AM, Sun Rui wrote:
I don’t think it’s possible, as Spark does not support thread-to-CPU affinity.
> On Aug 4, 2016, at 14:27, sujeet jog wrote:
Is there a way we can run multiple tasks concurrently on a single core in
local mode?

For example: I have 5 partitions, so ~5 tasks, and only a single core. I want
these tasks to run concurrently, and to specify that they use/run on a single
core. The machine itself has, say, 4 cores, but I want to utilize only one.
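One way to get that behavior, sketched under the assumption that the app is launched via spark-submit on Linux (the script name my_app.py is hypothetical): local[5] gives five concurrent task threads, and taskset confines the whole JVM to one core before any thread is spawned.

```shell
# local[5]: five task slots, so all 5 partitions run concurrently.
# taskset -c 0: the JVM, and every thread it creates, inherits core 0.
taskset -c 0 spark-submit --master "local[5]" my_app.py
```

Launching under taskset avoids the -a re-pinning step, since child threads inherit the parent's affinity mask at creation time.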