Also check the web UI for that. Each iteration will have one or more stages
associated with it in the driver web UI.
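If it helps, here is a minimal sketch (not from the original thread; the
input path and iteration count are placeholders) of training with an
explicit numIterations, so the stages shown in the UI can be matched
against iterations:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.classification.SVMWithSGD
import org.apache.spark.mllib.util.MLUtils

val sc = new SparkContext(new SparkConf().setAppName("svm-timing"))
// Placeholder path; expects LIBSVM-formatted input.
val training = MLUtils.loadLibSVMFile(sc, "hdfs:///path/to/data").cache()
val numIterations = 100
// Each SGD iteration runs as one or more stages whose durations appear
// in the driver web UI (by default at http://<driver-host>:4040).
val model = SVMWithSGD.train(training, numIterations)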
On Sat, Jul 12, 2014 at 6:47 PM, crater wrote:
> Hi Xiangrui,
>
> Thanks for the information. Also, is it possible to figure out the
> execution time per iteration for SVM?
By default, Spark uses half of the memory for caching RDDs
(configurable by spark.storage.memoryFraction). That is about 25 * 8 /
2 = 100G for your setup, which is smaller than the 202G data size. So
you don't have enough memory to fully cache the RDD. You can confirm
it in the storage tab of the web UI.
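For reference, a sketch of the two usual workarounds (the path is a
placeholder and the fraction value is only an example): raise
spark.storage.memoryFraction, or persist with a storage level that spills
to disk so the RDD does not have to fit entirely in memory:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

val conf = new SparkConf()
  .setAppName("svm-example")
  // Default is 0.5 (half the memory), per the note above; raising it
  // gives the cache a larger share of executor memory.
  .set("spark.storage.memoryFraction", "0.6")
val sc = new SparkContext(conf)

// Placeholder path. MEMORY_AND_DISK keeps what fits in memory and spills
// the rest to disk instead of recomputing partitions that get evicted.
val data = sc.textFile("hdfs:///path/to/data")
data.persist(StorageLevel.MEMORY_AND_DISK)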