My error was indeed related to the Scala version. Upon further reading, I
realized that it takes some effort to get Spark working with Scala 2.11, so
I've reverted to 2.10 and moved past that error. Now I've hit the issue you
mentioned (SPARK-8368). Waiting for 1.4.1.
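
For anyone who hits this later, the change that mattered was aligning
build.sbt with the Scala version of the prebuilt Spark binaries (a minimal
sketch; 2.10.4 is just the point release I happened to use):

scalaVersion := "2.10.4"

// With %%, the Spark artifacts resolve to their _2.10 variants automatically.
// spark-csv hard-codes the suffix, so it has to change by hand:
libraryDependencies += "com.databricks" % "spark-csv_2.10" % "1.0.3"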

Srikanth

On Fri, Jun 26, 2015 at 9:10 AM, Roberto Coluccio <roberto.coluc...@gmail.com> wrote:

> I got a similar issue. Might yours be related to
> https://issues.apache.org/jira/browse/SPARK-8368 as well?
>
> On Fri, Jun 26, 2015 at 2:00 PM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
>
>> Are those provided Spark libraries compatible with Scala 2.11?
>>
>> Thanks
>> Best Regards
>>
>> On Fri, Jun 26, 2015 at 4:48 PM, Srikanth <srikanth...@gmail.com> wrote:
>>
>>> Thanks, Akhil, for checking this out. Here is my build.sbt:
>>>
>>> name := "Weblog Analysis"
>>>
>>> version := "1.0"
>>>
>>> scalaVersion := "2.11.5"
>>>
>>> javacOptions ++= Seq("-source", "1.7", "-target", "1.7")
>>>
>>> libraryDependencies ++= Seq(
>>>   "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
>>>   "org.apache.spark" %% "spark-sql" % "1.4.0",
>>>   "org.apache.spark" %% "spark-streaming" % "1.4.0",
>>>   "org.apache.spark" %% "spark-streaming-kafka" % "1.4.0",
>>>   "org.apache.spark" %% "spark-mllib" % "1.4.0",
>>>   "org.apache.commons" % "commons-lang3" % "3.0",
>>>   "org.eclipse.jetty"  % "jetty-client" % "8.1.14.v20131031",
>>>   "org.scalatest" %% "scalatest" % "2.2.1" % "test",
>>>   "com.databricks" % "spark-csv_2.11" % "1.0.3",
>>>   "joda-time" % "joda-time" % "2.8.1",
>>>   "org.joda"  % "joda-convert" % "1.7"
>>> )
>>>
>>> resolvers ++= Seq(
>>>   "Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/",
>>>   "Sonatype public"        at "http://oss.sonatype.org/content/groups/public/",
>>>   "Sonatype"               at "http://nexus.scala-tools.org/content/repositories/public",
>>>   "Scala Tools"            at "http://scala-tools.org/repo-snapshots/",
>>>   "Typesafe"               at "http://repo.typesafe.com/typesafe/releases/",
>>>   "Akka"                   at "http://akka.io/repository/",
>>>   "JBoss"                  at "http://repository.jboss.org/nexus/content/groups/public/",
>>>   "GuiceyFruit"            at "http://guiceyfruit.googlecode.com/svn/repo/releases/"
>>> )
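>>>
>>> (One thing I notice re-reading this: spark-csv is the only dependency with a
>>> hard-coded Scala suffix. The %% form keeps the suffix in sync with
>>> scalaVersion automatically; a sketch of the equivalent line:
>>>
>>>   "com.databricks" %% "spark-csv" % "1.0.3",
>>>
>>> Not the root cause by itself, but it avoids accidentally mixing suffixes.)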
>>>
>>> On Fri, Jun 26, 2015 at 4:13 AM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
>>>
>>>> It's a Scala version conflict. Can you paste your build.sbt file?
>>>>
>>>> Thanks
>>>> Best Regards
>>>>
>>>> On Fri, Jun 26, 2015 at 7:05 AM, stati <srikanth...@gmail.com> wrote:
>>>>
>>>>> Hello,
>>>>>
>>>>> When I run a Spark job with spark-submit, it fails with the exception
>>>>> below at this line:
>>>>>        /*val webLogDF = webLogRec.toDF().select("ip", "date", "name")*/
>>>>>
>>>>> I had a similar issue running from spark-shell, and realized that I needed
>>>>> to import sqlContext.implicits._
>>>>> Now my code has the following imports:
>>>>> /*
>>>>>      import org.apache.spark._
>>>>>      import org.apache.spark.sql._
>>>>>      import org.apache.spark.sql.functions._
>>>>>      val sqlContext = new SQLContext(sc)
>>>>>      import sqlContext.implicits._
>>>>> */
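>>>>>
>>>>> (For this kind of NoSuchMethodError on scala.reflect, which usually means
>>>>> two Scala versions are mixed on the classpath, a quick sanity check is to
>>>>> print the Scala library the driver actually loads; a throwaway sketch, not
>>>>> part of the job:)
>>>>> /*
>>>>>      // e.g. prints "version 2.10.4"; if this differs from the version the
>>>>>      // jar was compiled against, the mismatch is confirmed
>>>>>      println(scala.util.Properties.versionString)
>>>>> */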
>>>>>
>>>>> The code works fine in the spark-shell REPL. It also runs fine in local
>>>>> mode from Eclipse. I get this error only when I submit to the cluster
>>>>> using spark-submit:
>>>>> bin/spark-submit --class WebLogAnalysis --master spark://machu:7077
>>>>> /local/weblog-analysis_2.11-1.0.jar
>>>>>
>>>>> I'm testing with Spark 1.4. My code was built with Scala 2.11, with Spark
>>>>> and Spark SQL 1.4.0 as dependencies in build.sbt.
>>>>>
>>>>> Exception in thread "main" java.lang.NoSuchMethodError:
>>>>> scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
>>>>>         at WebLogAnalysis$.readWebLogFiles(WebLogAnalysis.scala:38)
>>>>>         at WebLogAnalysis$.main(WebLogAnalysis.scala:62)
>>>>>         at WebLogAnalysis.main(WebLogAnalysis.scala)
>>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>>>>         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
>>>>>         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
>>>>>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
>>>>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
>>>>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>>>
>>>>> I can provide more code or logs if that will help. Let me know.
>>>>>
>>>>> Srikanth
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
