It's a Typesafe Config jar conflict: Spark's own dependencies pull in an older
config jar that lacks the Config.getDuration(String, TimeUnit) method. You will
need to put the jar that has it (config-1.2.1.jar) first on your classpath.
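One way to force that ordering with spark-submit is --driver-class-path, which prepends entries to the driver's classpath ahead of Spark's own jars. A sketch based on your command, assuming config-1.2.1.jar sits in the working directory:

```shell
# Prepend the newer Typesafe config jar so its Config.getDuration wins
# over the older copy pulled in transitively by Spark.
bin/spark-submit \
  --class "SimpleAppStreaming3" \
  --master "local[*]" \
  --driver-class-path config-1.2.1.jar \
  --jars "spark-cassandra-connector_2.10-1.1.0.jar,cassandra-driver-core-2.1.3.jar,cassandra-thrift-2.0.5.jar,joda-time-2.6.jar,akka-stream-experimental_2.10-1.0-M2.jar,config-1.2.1.jar" \
  target/scala-2.10/simple-project_2.10-1.0.jar
```

If the same NoSuchMethodError later shows up on the executors as well, the (experimental) setting spark.files.userClassPathFirst=true makes executors prefer user-supplied jars over Spark's.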
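For your bonus question: a common way to avoid listing every jar with --jars is to build a single fat jar with the sbt-assembly plugin and submit only that. A sketch, assuming you add sbt-assembly to project/assembly.sbt and mark the spark-* dependencies as "provided" in build.sbt (the assembly jar name below is sbt-assembly's default, not something from your build):

```shell
# Bundle the application and all its non-provided dependencies
# into one jar, then submit just that jar.
sbt clean assembly
bin/spark-submit \
  --class "SimpleAppStreaming3" \
  --master "local[*]" \
  target/scala-2.10/simple-project-assembly-1.0.jar
```

Note that a fat jar does not by itself fix the classpath-ordering problem; you may still need a dependencyOverrides pin or a merge strategy so only config 1.2.1 ends up in the assembly.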

Thanks
Best Regards

On Wed, Dec 31, 2014 at 4:38 PM, Christophe Billiard <
christophe.billi...@gmail.com> wrote:

> Hi all,
>
> I am currently trying to combine Datastax's "spark-cassandra-connector" and
> Typesafe's "akka-http-experimental"
> on Spark 1.1.1 (the spark-cassandra-connector for Spark 1.2.0 is not out
> yet) with Scala 2.10.4.
> I am using the Hadoop 2.4 pre-built package (build.sbt file at the end).
>
> To solve java.lang.NoClassDefFoundError:
> com/datastax/spark/connector/mapper/ColumnMapper
> and other NoClassDefFoundErrors, I have to pass some jars to Spark
> (build.sbt alone is not enough).
> The connector then works fine.
>
> My spark submit looks like:
> sbt clean package; bin/spark-submit   --class "SimpleAppStreaming3"
> --master local[*]  --jars
>
> "spark-cassandra-connector_2.10-1.1.0.jar","cassandra-driver-core-2.1.3.jar","cassandra-thrift-2.0.5.jar","joda-time-2.6.jar"
> target/scala-2.10/simple-project_2.10-1.0.jar
>
> Then I am trying to add some akka-http/akka-stream features.
> Like before I get a java.lang.NoClassDefFoundError:
> akka/stream/FlowMaterializer$
> Same solution, I begin to add jars.
>
> Now my spark submit looks like:
> sbt clean package; bin/spark-submit   --class "SimpleAppStreaming3"
>  --master
> local[*]  --jars
>
> "spark-cassandra-connector_2.10-1.1.0.jar","cassandra-driver-core-2.1.3.jar","cassandra-thrift-2.0.5.jar","joda-time-2.6.jar","akka-stream-experimental_2.10-1.0-M2.jar"
> target/scala-2.10/simple-project_2.10-1.0.jar
>
> Then I get a new kind of error:
> Exception in thread "main" java.lang.NoSuchMethodError:
>
> com.typesafe.config.Config.getDuration(Ljava/lang/String;Ljava/util/concurrent/TimeUnit;)J
>         at
>
> akka.stream.StreamSubscriptionTimeoutSettings$.apply(FlowMaterializer.scala:256)
>         at
> akka.stream.MaterializerSettings$.apply(FlowMaterializer.scala:185)
>         at
> akka.stream.MaterializerSettings$.apply(FlowMaterializer.scala:172)
>         at
> akka.stream.FlowMaterializer$$anonfun$1.apply(FlowMaterializer.scala:42)
>         at
> akka.stream.FlowMaterializer$$anonfun$1.apply(FlowMaterializer.scala:42)
>         at scala.Option.getOrElse(Option.scala:120)
>         at akka.stream.FlowMaterializer$.apply(FlowMaterializer.scala:42)
>         at SimpleAppStreaming3$.main(SimpleAppStreaming3.scala:240)
>         at SimpleAppStreaming3.main(SimpleAppStreaming3.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at
> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:329)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> I can't get rid of this error.
> I tried:
> 1) adding several jars (including "config-1.2.1.jar")
> 2) studying the dependency tree (with
> https://github.com/jrudolph/sbt-dependency-graph)
> 3) pinning the config version (with dependencyOverrides)
>
> Any ideas?
>
> Bonus question: Is there a way to avoid adding all these jars with --jars?
>
> *My build.sbt file*
>
> name := "Simple Project"
>
> version := "1.0"
>
> scalaVersion := "2.10.4"
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1"
> //exclude("com.typesafe", "config")
>
> libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.1.1"
>
> libraryDependencies += "com.datastax.cassandra" % "cassandra-driver-core" %
> "2.1.3"
>
> libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector"
> % "1.1.0" withSources() withJavadoc()
>
> libraryDependencies += "org.apache.cassandra" % "cassandra-thrift" %
> "2.0.5"
>
> libraryDependencies += "joda-time" % "joda-time" % "2.6"
>
>
>
> libraryDependencies += "com.typesafe.akka" %% "akka-actor"      % "2.3.8"
>
> libraryDependencies += "com.typesafe.akka" %% "akka-testkit"    % "2.3.8"
>
> libraryDependencies += "org.apache.hadoop" %  "hadoop-client"   % "2.4.0"
>
> libraryDependencies += "ch.qos.logback"    %  "logback-classic" % "1.1.2"
>
> libraryDependencies += "org.mockito"       %  "mockito-all"     % "1.10.17"
>
> libraryDependencies += "org.scalatest"     %% "scalatest"       % "2.2.3"
>
> libraryDependencies += "org.slf4j"         %  "slf4j-api"       % "1.7.5"
>
> libraryDependencies += "org.apache.spark"  %% "spark-streaming" % "1.1.1"
>
>
> libraryDependencies += "com.typesafe.akka" %% "akka-stream-experimental"
> % "1.0-M2"
>
> libraryDependencies += "com.typesafe.akka" %% "akka-http-experimental"
> % "1.0-M2"
>
> libraryDependencies += "com.typesafe.akka" %% "akka-http-core-experimental"
> % "1.0-M2"
>
>
> libraryDependencies += "com.typesafe" % "config" % "1.2.1"
>
> dependencyOverrides += "com.typesafe" % "config" % "1.2.1"
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-com-typesafe-config-Config-getDuration-with-akka-http-akka-stream-tp20926.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
