That's all very old functionality in Spark terms, so it shouldn't have
anything to do with your installation being out-of-date.  There is also no
need to cast: reduceByKey is defined in PairRDDFunctions, and it becomes
available on any RDD of key/value pairs as long as the relevant implicit
conversions are in scope:
import org.apache.spark.SparkContext._
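
To illustrate the pattern (this is a sketch with hypothetical names, not Spark's actual code): the import above brings into scope an implicit conversion that wraps an RDD[(K, V)] in PairRDDFunctions, which is where reduceByKey is defined. The same "enrich my library" technique can be shown with a plain Seq, no Spark required:

```scala
// Hypothetical stand-in for Spark's implicit conversion to PairRDDFunctions:
// once this implicit class is in scope, reduceByKey "appears" on any Seq of pairs.
implicit class PairSeqFunctions[K, V](self: Seq[(K, V)]) {
  // Mirrors the shape of Spark's reduceByKey, but on a local collection.
  def reduceByKey(f: (V, V) => V): Map[K, V] =
    self.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).reduce(f) }
}

// With the implicit in scope (for Spark, via import org.apache.spark.SparkContext._),
// no cast is needed -- the compiler inserts the conversion automatically.
val counts = Seq(("a", 1), ("b", 1), ("a", 2)).reduceByKey(_ + _)
println(counts("a"))  // 3
```

The point is that the method is never on the RDD class itself; the compiler rewrites the call through the conversion, which is why a missing import looks like a missing method.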


On Tue, May 20, 2014 at 1:00 PM, GlennStrycker <glenn.stryc...@gmail.com> wrote:

> I don't seem to have this function in my Spark installation for this
> object,
> or the classes MappedRDD, FlatMappedRDD, EdgeRDD, VertexRDD, or Graph.
>
> Which class should have the reduceByKey function, and how do I cast my
> current RDD as this class?
>
> Perhaps this is still due to my Spark installation being out-of-date?
>
>
>
> --
> View this message in context:
> http://apache-spark-developers-list.1001551.n3.nabble.com/BUG-graph-triplets-does-not-return-proper-values-tp6693p6728.html
> Sent from the Apache Spark Developers List mailing list archive at
> Nabble.com.
>