Here's a dump of the SO question I opened:
(http://stackoverflow.com/questions/31033724/spark-1-4-0-java-lang-nosuchmethoderror-com-google-common-base-stopwatch-elapse)

I'm using Spark 1.4.0, and when running the Scala SparkPageRank example
(*examples/src/main/scala/org/apache/spark/examples/SparkPageRank.scala*) I
encounter the following error:

    Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.elapsedMillis()J
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:245)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
        at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:207)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
        at org.apache.spark.rdd.RDD$$anonfun$distinct$2.apply(RDD.scala:329)
        at org.apache.spark.rdd.RDD$$anonfun$distinct$2.apply(RDD.scala:329)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
        at org.apache.spark.rdd.RDD.distinct(RDD.scala:328)
        at org.apache.spark.examples.SparkPageRank$.main(SparkPageRank.scala:60)
        at org.apache.spark.examples.SparkPageRank.main(SparkPageRank.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:621)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


I'm not extremely familiar with Java, but this looks like a [guava][1]
version conflict: the code throwing the error was apparently compiled against
a `Stopwatch` that still had `elapsedMillis()`, while the class on the
runtime classpath no longer does.

The following information could be helpful:

    $ find ./spark -name "*.jar" | grep guava
    ./lib_managed/bundles/guava-16.0.1.jar
    ./lib_managed/bundles/guava-14.0.1.jar

Part of the examples/pom.xml file:

    ...
     <dependency>
          <groupId>org.apache.cassandra</groupId>
          <artifactId>cassandra-all</artifactId>
          <version>1.2.6</version>
          <exclusions>
            <exclusion>
              <groupId>com.google.guava</groupId>
              <artifactId>guava</artifactId>
            </exclusion>
    ...
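
For what it's worth, if the problem really is guava-16.0.1 winning over
guava-14.0.1 in the assembly, I imagine one fix would be to pin Guava to the
version the Hadoop classes expect, e.g. with a `<dependencyManagement>` entry
like the sketch below (untested, and the right version number is my guess):

```xml
<!-- Sketch only: force the older Guava so it wins dependency mediation
     over the 16.0.1 pulled in transitively. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>14.0.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```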

And indeed it seems that the class does not contain the problematic method:

    $ javap -p /mnt/spark/examples/target/streams/\$global/assemblyOption/\$global/streams/assembly/7850cb6d36b2a6589a4d27ce027a65a2da72c9df_5fa98cd1a63c99a44dd8d3b77e4762b066a5d0c5/com/google/common/base/Stopwatch.class
    
    Compiled from "Stopwatch.java"
    public final class com.google.common.base.Stopwatch {
      private final com.google.common.base.Ticker ticker;
      private boolean isRunning;
      private long elapsedNanos;
      private long startTick;
      public static com.google.common.base.Stopwatch createUnstarted();
      public static com.google.common.base.Stopwatch createUnstarted(com.google.common.base.Ticker);
      public static com.google.common.base.Stopwatch createStarted();
      public static com.google.common.base.Stopwatch createStarted(com.google.common.base.Ticker);
      public com.google.common.base.Stopwatch();
      public com.google.common.base.Stopwatch(com.google.common.base.Ticker);
      public boolean isRunning();
      public com.google.common.base.Stopwatch start();
      public com.google.common.base.Stopwatch stop();
      public com.google.common.base.Stopwatch reset();
      private long elapsedNanos();
      public long elapsed(java.util.concurrent.TimeUnit);
      public java.lang.String toString();
      private static java.util.concurrent.TimeUnit chooseUnit(long);
      private static java.lang.String abbreviate(java.util.concurrent.TimeUnit);
    }

I would like to understand the issue better and, if possible, learn how to
fix it :-)

  [1]: https://github.com/google/guava



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Exception-in-thread-main-java-lang-NoSuchMethodError-com-google-common-base-Stopwatch-elapsedMillis-J-tp23476.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
