Try adding the following entry to your conf/spark-defaults.conf file:

spark.cores.max 64

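If you prefer to set it in code, here is a minimal sketch using the PySpark API (same property name and value as above; whether it takes effect depends on your cluster manager, so treat it as an illustration rather than the definitive fix):

    from pyspark import SparkConf, SparkContext

    # Set the same property programmatically before the context is created.
    conf = SparkConf().setAppName("example").set("spark.cores.max", "64")
    sc = SparkContext(conf=conf)
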
Thanks
Best Regards

On Sun, Nov 9, 2014 at 3:50 AM, Blind Faith <person.of.b...@gmail.com>
wrote:

> I am a Spark newbie and I use Python (PySpark). I am trying to run a
> program on a 64-core system, but no matter what I do, it always uses 1
> core. It doesn't matter whether I run it with "spark-submit --master
> local[64] run.sh" or call x.repartition(64) on an RDD in my code; the
> Spark program always uses one core. Has anyone had success running Spark
> programs on multicore processors? Could someone provide a very simple
> example that properly runs on all cores of a multicore system?
>
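
Regarding the request for a "very simple example", here is a minimal sketch of a job that should fan out across cores when launched with --master local[64]; the app name, element count, and operations are just placeholders:

    from pyspark import SparkContext

    sc = SparkContext("local[64]", "MultiCoreExample")
    # Create an RDD with enough partitions that tasks can run on all cores.
    rdd = sc.parallelize(range(1000000), 64)
    result = rdd.map(lambda x: x * x).reduce(lambda a, b: a + b)
    print(result)
    sc.stop()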
