Hi,

I wonder whether the PageRank implementation is correct. Specifically, I am
looking at the following function from PageRank.scala
<https://github.com/apache/spark/blob/master/graphx/src/main/scala/org/apache/spark/graphx/lib/PageRank.scala>,
which is passed to Pregel:

    def vertexProgram(id: VertexId, attr: (Double, Double), msgSum: Double): (Double, Double) = {
      val (oldPR, lastDelta) = attr
      val newPR = oldPR + (1.0 - resetProb) * msgSum
      (newPR, newPR - oldPR)
    }

This line:

    val newPR = oldPR + (1.0 - resetProb) * msgSum

makes no sense to me. Should it not be:

    val newPR = resetProb / graph.vertices.count() + (1.0 - resetProb) * msgSum

?

Background: I wanted to implement PageRank with the damping factor (here:
resetProb) divided by the number of nodes in the graph.
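For reference, here is a minimal sketch of the normalized variant I have in
mind. The names normalizedVertexProgram and numVertices are mine, not from the
Spark source; I assume the vertex count is computed once up front rather than
inside the vertex program:

    import org.apache.spark.graphx._

    // Sketch of the normalized update PR(v) = resetProb / N + (1 - resetProb) * msgSum,
    // where N is the total number of vertices. resetProb and numVertices are
    // assumed to be fixed before the iteration starts, e.g.
    // val numVertices = graph.vertices.count()
    def normalizedVertexProgram(resetProb: Double, numVertices: Long)(
        id: VertexId, attr: (Double, Double), msgSum: Double): (Double, Double) = {
      val (oldPR, _) = attr
      val newPR = resetProb / numVertices + (1.0 - resetProb) * msgSum
      (newPR, newPR - oldPR)
    }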

Tom


