Hello, I am running the SparkPageRank example, which uses the cache() API for persistence. AFAIK, cache() uses the MEMORY_ONLY storage level. But even in this setup, I see a lot of "WARN ExternalAppendOnlyMap: Spilling in-memory map of ..." messages in the log. Why is that? I thought MEMORY_ONLY meant Spark would evict RDD partitions when there isn't enough memory, rather than spill to disk.
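For reference, here is a minimal sketch of the setup I mean (the input path and iteration count are placeholders, not the exact example code): cache() is shorthand for persist(StorageLevel.MEMORY_ONLY), so the two calls below should be equivalent.

```scala
import org.apache.spark.storage.StorageLevel

// Hypothetical sketch of the PageRank-style caching in question.
// "links.txt" is a placeholder input of "src dst" pairs.
val lines = sc.textFile("links.txt")
val links = lines.map { line =>
  val parts = line.split("\\s+")
  (parts(0), parts(1))
}.distinct().groupByKey()

// cache() is equivalent to persist(StorageLevel.MEMORY_ONLY):
links.cache()
// links.persist(StorageLevel.MEMORY_ONLY)  // same storage level

var ranks = links.mapValues(_ => 1.0)
for (_ <- 1 to 10) {
  val contribs = links.join(ranks).values.flatMap {
    case (urls, rank) => urls.map(url => (url, rank / urls.size))
  }
  // reduceByKey triggers a shuffle; this is where I see the spilling warnings
  ranks = contribs.reduceByKey(_ + _).mapValues(0.15 + 0.85 * _)
}
```

The warnings appear during the reduceByKey/join stages, which is what makes me suspect they are not about the cached RDD itself.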
Thanks,
Lokesh

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spilling-in-memory-messages-in-log-even-with-MEMORY-ONLY-tp10723.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.