I am a Spark newbie using Python (pyspark). I am trying to run a program on a 64-core system, but no matter what I do, it always uses only one core. It makes no difference whether I launch it with "spark-submit --master local[64] run.sh" or call x.repartition(64) on an RDD in my code; the Spark program still runs on a single core. Does anyone have experience successfully running Spark programs on multicore processors? Could someone provide a very simple example that properly runs on all cores of a multicore system?
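For reference, here is a minimal sketch of the kind of test I have been trying (the app name and the burn() function are just placeholders for illustration; this assumes the standard pyspark API):

from pyspark import SparkConf, SparkContext

# Ask for 64 local worker threads (same effect as --master local[64]).
conf = SparkConf().setMaster("local[64]").setAppName("multicore-test")
sc = SparkContext(conf=conf)

# A CPU-bound function so per-core activity is visible in top/htop.
def burn(n):
    total = 0
    for i in range(1000000):
        total += (n * i) % 7
    return total

# 64 partitions, so up to 64 tasks should be able to run in parallel.
rdd = sc.parallelize(range(64), 64)
print(rdd.map(burn).sum())

sc.stop()

Shouldn't something like this keep all 64 cores busy?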