I don't believe there is a direct equivalent, though .ml does have ClusteringMetrics for evaluating clusterings. I'm a bit puzzled that it doesn't expose sum of squared distances though; it only computes silhouette? In any event, you can compute it directly easily enough, either by writing a few lines of code yourself or by using the .mllib model inside the .ml model object anyway.
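To illustrate the "few lines of code" route, here is a rough, untested sketch of computing the within-cluster sum of squared errors by hand from a fitted org.apache.spark.ml.clustering.KMeansModel. It assumes a hypothetical helper named wssse, a fitted model `model`, and a `predictions` DataFrame produced by model.transform(data) with "features" and "prediction" columns (names are assumptions, not part of any Spark API):

import org.apache.spark.ml.clustering.KMeansModel
import org.apache.spark.ml.linalg.{Vector, Vectors}
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.{col, sum, udf}

// Sketch: sum of squared distances from each point to its assigned cluster center.
def wssse(model: KMeansModel, predictions: DataFrame): Double = {
  // Cluster centers from the fitted .ml model (Array[Vector])
  val centers = model.clusterCenters

  // Squared Euclidean distance from a point to its assigned center
  val sqDist = udf { (features: Vector, cluster: Int) =>
    Vectors.sqdist(features, centers(cluster))
  }

  predictions
    .select(sqDist(col("features"), col("prediction")).as("sqDist"))
    .agg(sum("sqDist"))
    .head()
    .getDouble(0)
}

// Usage (assuming `data` is the training DataFrame):
// val predictions = model.transform(data)
// val cost = wssse(model, predictions)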
On Mon, Nov 29, 2021 at 2:50 PM Artemis User <arte...@dtechspace.com> wrote:
> The RDD-based org.apache.spark.mllib.clustering.KMeansModel class
> defines a method called computeCost that is used to calculate the WCSS
> error of K-Means clusters
> (https://spark.apache.org/docs/latest/api/scala/org/apache/spark/mllib/clustering/KMeansModel.html).
>
> Is there an equivalent method of computeCost in the new ml library for
> K-Means?
>
> Thanks in advance!
>
> -- ND