Hi,
I have a simple DecisionForest model that trained without any issues on
pyspark==2.4.4.
However, after upgrading to pyspark==3.0.2, fit() takes much longer and
eventually fails with an out-of-memory error. I also tried reducing the
number of training samples, but with no luck.
Can anyone help with this?

Best,
Praneeth




