Hi,

I'm using MLlib to train a random forest. It works fine up to depth 15, but at depth 20 I get a java.lang.OutOfMemoryError: Requested array size exceeds VM limit on the driver, thrown from the collectAsMap operation in DecisionTree.scala, around line 642. It doesn't happen until a good hour into training. I'm using 50 trees on 36 slaves with maxMemoryInMB=250, and I still get the error even with 240G of driver memory. Has anybody seen this error in this context before, and can you advise on what might be triggering
Best,
Luke


