Hi All,

Sorry for reposting this; I am hoping to get some clues.

Best Regards,
Sonal
Nube Technologies <http://www.nubetech.co>

<http://in.linkedin.com/in/sonalgoyal>




On Wed, Aug 13, 2014 at 3:53 PM, Sonal Goyal <sonalgoy...@gmail.com> wrote:

> Hi,
>
> I am trying to run and test some GraphX APIs from Java. I started with
> connected components; here is my code.
>
> JavaRDD<Edge<Long>> edges;
> // code to populate the edges
> ..
> ..
> ClassTag<Long> longTag = scala.reflect.ClassTag$.MODULE$.apply(Long.class);
> ClassTag<Float> floatTag = scala.reflect.ClassTag$.MODULE$.apply(Float.class);
> // vertex attribute type Float (default value 1.0F), edge attribute type Long
> Graph<Float, Long> graph = Graph.fromEdges(edges.rdd(), 1.0F, floatTag, longTag);
> VertexRDD<Object> cc = graph.ops().connectedComponents().vertices();
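>
> For reference, a minimal sketch of how I would expect the edge RDD to be
> populated (assuming a JavaSparkContext named sc and imports for
> org.apache.spark.graphx.Edge and java.util.Arrays; the real ids and
> attributes come from my data):
>
> // The edge attribute type is Long, so attr must be a java.lang.Long
> // (note the L suffix), not an autoboxed Integer.
> JavaRDD<Edge<Long>> edges = sc.parallelize(Arrays.asList(
>         new Edge<Long>(1L, 2L, 1L),
>         new Edge<Long>(2L, 3L, 1L)));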
>
>
> Running the code gives me the following exception. It seems an array is
> being populated with the wrong type, but I am out of my depth with my
> limited Scala knowledge and unable to pinpoint what's going wrong.
>
> java.lang.ArrayStoreException: java.lang.Integer
>     at scala.runtime.ScalaRunTime$.array_update(ScalaRunTime.scala:88)
>     at org.apache.spark.graphx.impl.EdgePartitionBuilder.toEdgePartition(EdgePartitionBuilder.scala:54)
>     at org.apache.spark.graphx.impl.GraphImpl$$anonfun$23.apply(GraphImpl.scala:321)
>     at org.apache.spark.graphx.impl.GraphImpl$$anonfun$23.apply(GraphImpl.scala:316)
>     at org.apache.spark.rdd.RDD$$anonfun$13.apply(RDD.scala:569)
>     at org.apache.spark.rdd.RDD$$anonfun$13.apply(RDD.scala:569)
>     at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
>     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
>     at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:77)
>     at org.apache.spark.rdd.RDD.iterator(RDD.scala:227)
>     at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
>     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
>     at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
>     at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:158)
>     at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
>     at org.apache.spark.scheduler.Task.run(Task.scala:51)
>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>     at java.lang.Thread.run(Thread.java:745)
>
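> The ArrayStoreException names java.lang.Integer, so my guess (only a
> guess, since the code populating the edges is elided above) is that an
> edge attribute is being created as an Integer somewhere, and type
> erasure lets it through until GraphX tries to store it into the Long[]
> that EdgePartitionBuilder allocates via the ClassTag. A contrived
> sketch of how that can compile yet fail at runtime:
>
> // Hypothetical: a raw Edge plus an unchecked cast compiles (with a
> // warning), but the Integer attr later fails the array store into
> // the Long[] built from the ClassTag.
> @SuppressWarnings("unchecked")
> Edge<Long> bad = (Edge<Long>) new Edge(1L, 2L, Integer.valueOf(1));
>
> If that is the cause, making sure every attribute is an explicit Long
> (e.g. 1L or Long.parseLong(...)) should avoid it.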
>
> Would appreciate any help or suggestions on how to get this working.
>
> Best Regards,
> Sonal
> Nube Technologies <http://www.nubetech.co>
>
> <http://in.linkedin.com/in/sonalgoyal>
>
>
>
