Re: --cores option in spark-shell

2014-07-14 Thread Andrew Or

Re: --cores option in spark-shell

2014-07-14 Thread cjwang
It does not work in the new 1.0.1 either.
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/cores-option-in-spark-shell-tp6809p9690.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: --cores option in spark-shell

2014-06-03 Thread Marek Wiewiorka
That used to work with version 0.9.1 and earlier and does not seem to work with 1.0.0.
M.

2014-06-03 17:53 GMT+02:00 Mikhail Strebkov :
> Try -c instead, works for me, e.g.
>
> bin/spark-shell -c 88
>
> On Tue, Jun 3, 2014 at 8:15 AM, Marek Wiewiorka wrote:
>> Hi All,
>> there is inf

Re: --cores option in spark-shell

2014-06-03 Thread Mikhail Strebkov
Try -c instead, works for me, e.g.

bin/spark-shell -c 88

On Tue, Jun 3, 2014 at 8:15 AM, Marek Wiewiorka wrote:
> Hi All,
> there is information in 1.0.0 Spark's documentation that
> there is an option "--cores" that one can use to set the number of cores
> that spark-shell uses on the clust

Re: --cores option in spark-shell

2014-06-03 Thread Matt Kielo
I haven't been able to set the cores with that option in Spark 1.0.0 either. To work around that, setting the environment variable SPARK_JAVA_OPTS="-Dspark.cores.max=" seems to do the trick.

Matt Kielo
Data Scientist
Oculus Info Inc.

On Tue, Jun 3, 2014 at 11:15 AM, Marek Wiewiorka wrote:
> Hi
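The workaround described above can be sketched as follows. Note this is a hedged reconstruction: the original message leaves the `spark.cores.max` value blank, so the value 8 here is a placeholder, and `SPARK_HOME` is assumed to point at a Spark 1.0.0 installation.

```shell
# Workaround sketch for Spark 1.0.0: cap total cores via a Java system
# property instead of the --cores flag.
# "8" is an example value only; the original message left it blank.
export SPARK_JAVA_OPTS="-Dspark.cores.max=8"

# SPARK_HOME is an assumption (path to a local Spark install).
"$SPARK_HOME/bin/spark-shell"
```

`spark.cores.max` limits the total cores the application requests across the cluster, which is why it has the same effect the `--cores` flag was meant to have.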

--cores option in spark-shell

2014-06-03 Thread Marek Wiewiorka
Hi All,
there is information in Spark 1.0.0's documentation that there is an option "--cores" that one can use to set the number of cores that spark-shell uses on the cluster:

You can also pass an option --cores to control the number of cores that spark-shell uses on the cluster.

This option doe
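For reference, the three invocation styles discussed in this thread can be summarized as below. The core count 8 is a placeholder value, and per the replies, the first form may be ignored in Spark 1.0.0/1.0.1.

```shell
# As documented, but reported non-functional in 1.0.0 and 1.0.1 in this thread:
bin/spark-shell --cores 8

# Short-flag form that one poster reported working:
bin/spark-shell -c 8

# System-property workaround suggested later in the thread:
SPARK_JAVA_OPTS="-Dspark.cores.max=8" bin/spark-shell
```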