Yes, the documentation is actually a little outdated. We will get around to
fixing it shortly. Please use --driver-cores or --executor-cores instead.
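For example, something along these lines (just a sketch, assuming a YARN
deployment where --executor-cores sets the cores per executor; the value 4 is
purely illustrative):

bin/spark-shell --master yarn-client --executor-cores 4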
2014-07-14 19:10 GMT-07:00 cjwang:
They do not work in the new 1.0.1 either.
That used to work with version 0.9.1 and earlier, but it does not seem to work
with 1.0.0.
M.
2014-06-03 17:53 GMT+02:00 Mikhail Strebkov:
Try -c instead, works for me, e.g.
bin/spark-shell -c 88
On Tue, Jun 3, 2014 at 8:15 AM, Marek Wiewiorka wrote:
> Hi All,
> there is information in Spark's 1.0.0 documentation that there is an
> option "--cores" that one can use to set the number of cores that
> spark-shell uses on the cluster.
I haven't been able to set the cores with that option in Spark 1.0.0 either.
To work around that, setting the environment variable
SPARK_JAVA_OPTS="-Dspark.cores.max=<number of cores>" seems to do the trick.
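For example (a sketch; the value 8 is just illustrative, and spark.cores.max
caps the total number of cores the application takes on the cluster):

export SPARK_JAVA_OPTS="-Dspark.cores.max=8"
bin/spark-shell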
Matt Kielo
Data Scientist
Oculus Info Inc.