>
> *From:* Bryan Cutler [mailto:cutl...@gmail.com]
> *Sent:* Thursday, January 14, 2016 2:19 PM
> *To:* Rachana Srivastava
> *Cc:* u...@spark.apache.org; dev@spark.apache.org
> *Subject:* Re: Random Forest FeatureImportance throwing
> NullPointerException
>
Hi Rachana,
I got the same exception. It is because computing the feature importances
depends on the impurity stats, which are not calculated with the old
RandomForestModel in MLlib. Feel free to create a JIRA for this if you
think it is necessary, otherwise I believe this problem will eventually be
solved.
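To illustrate why the missing impurity stats matter: each split's contribution to a feature's importance is the impurity reduction it achieves, weighted by the number of samples reaching that node; with no per-node impurity recorded (as in the old MLlib model), that sum cannot be formed. Below is a minimal self-contained sketch of that computation using a hypothetical `Node` class of my own, not Spark's internal tree representation:

```java
import java.util.Arrays;

// Sketch of impurity-based feature importance for a single decision tree.
// The Node class here is a stand-in for illustration only; Spark's actual
// internal node classes differ.
public class FeatureImportanceSketch {
    static class Node {
        int feature;      // index of the split feature (-1 for a leaf)
        double impurity;  // e.g. Gini impurity at this node
        int count;        // number of training samples reaching this node
        Node left, right;
        Node(int feature, double impurity, int count, Node left, Node right) {
            this.feature = feature; this.impurity = impurity;
            this.count = count; this.left = left; this.right = right;
        }
    }

    // Walk the tree, adding each split's sample-weighted impurity decrease
    // to the importance of the feature it splits on.
    static void accumulate(Node n, double[] importance) {
        if (n == null || n.feature < 0) return;  // skip leaves
        double gain = n.count * n.impurity
                - n.left.count * n.left.impurity
                - n.right.count * n.right.impurity;
        importance[n.feature] += gain;
        accumulate(n.left, importance);
        accumulate(n.right, importance);
    }

    static double[] featureImportances(Node root, int numFeatures) {
        double[] imp = new double[numFeatures];
        accumulate(root, imp);
        double total = Arrays.stream(imp).sum();
        if (total > 0) {
            for (int i = 0; i < imp.length; i++) imp[i] /= total;
        }
        return imp;
    }

    public static void main(String[] args) {
        // Tiny hand-built tree: root splits on feature 0,
        // its left child splits on feature 1.
        Node leafA = new Node(-1, 0.0, 30, null, null);
        Node leafB = new Node(-1, 0.0, 20, null, null);
        Node leftChild = new Node(1, 0.2, 50, leafA, leafB);
        Node rightLeaf = new Node(-1, 0.0, 50, null, null);
        Node root = new Node(0, 0.5, 100, leftChild, rightLeaf);
        System.out.println(Arrays.toString(featureImportances(root, 2)));
    }
}
```

If any node's impurity or count were absent (null stats), the `gain` term above could not be computed, which is exactly the situation with a converted old-style model. For a forest, the per-tree vectors are averaged before normalizing.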
Tried using the 1.6 version of Spark, which takes numberOfFeatures as a fifth
argument in the API, but featureImportances still comes back null.

RandomForestClassifier rfc = getRandomForestClassifier(numTrees, maxBinSize,
maxTreeDepth, seed, impurity);
RandomForestClassificationModel rfm =
RandomFores