Have you tried the following?

import sqlContext._
import sqlContext.implicits._
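For context, here is a minimal sketch of how those imports would sit in the
snippet from your message (assuming Spark 1.3.x and an existing SparkContext
named sc, as in the Spark shell; this is an illustration, not the only
arrangement):

  // Case class describing the schema of each row.
  case class Person(name: String, age: Int)

  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
  import sqlContext._            // SQL helper methods
  import sqlContext.implicits._  // enables .toDF() on RDDs of case classes

  // Read, parse, and convert to a DataFrame via the implicit toDF().
  val people = sc.textFile("examples/src/main/resources/people.txt")
    .map(_.split(","))
    .map(p => Person(p(0), p(1).trim.toInt))
    .toDF()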
Cheers

On Tue, Apr 21, 2015 at 7:54 AM, Wang, Ningjun (LNG-NPV) <
ningjun.w...@lexisnexis.com> wrote:

> I tried to convert an RDD to a DataFrame using the example code from the
> Spark website:
>
> case class Person(name: String, age: Int)
>
> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
> import sqlContext.implicits._
>
> val people = sc.textFile("examples/src/main/resources/people.txt")
>   .map(_.split(","))
>   .map(p => Person(p(0), p(1).trim.toInt))
>   .toDF()
>
> It compiles fine using sbt, but in IntelliJ IDEA 14.0.3 it fails to
> compile with the following error:
>
> Error:(289, 23) value implicits is not a member of
> org.apache.spark.sql.SQLContext
>     import sqlContext.implicits._
>                       ^
>
> What is the problem here? Is there any workaround (e.g. doing the
> conversion explicitly)?
>
> Here is my build.sbt:
>
> name := "my-project"
>
> version := "0.2"
>
> scalaVersion := "2.10.4"
>
> val sparkVersion = "1.3.1"
>
> val luceneVersion = "4.10.2"
>
> libraryDependencies <<= scalaVersion {
>   scala_version => Seq(
>     "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
>     "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided",
>     "spark.jobserver" % "job-server-api" % "0.4.1" % "provided",
>     "org.scalatest" %% "scalatest" % "2.2.1" % "test"
>   )
> }
>
> resolvers += "Spark Packages Repo" at
>   "http://dl.bintray.com/spark-packages/maven"
>
> resolvers += Resolver.mavenLocal
>
> Ningjun
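Since the message asks about an explicit conversion, here is a minimal sketch
of that workaround, assuming Spark 1.3.x and the same Person case class: it
uses SQLContext.createDataFrame, so no import of sqlContext.implicits._ is
needed. Note that both implicits and createDataFrame live in the spark-sql
artifact, which does not appear in the build.sbt above, so adding
"org.apache.spark" %% "spark-sql" % sparkVersion to libraryDependencies may
also be needed before IDEA can resolve them.

  // Minimal sketch of the explicit conversion (assumes Spark 1.3.x,
  // an existing SparkContext `sc`, and spark-sql on the classpath).
  case class Person(name: String, age: Int)

  val sqlContext = new org.apache.spark.sql.SQLContext(sc)

  val peopleRdd = sc.textFile("examples/src/main/resources/people.txt")
    .map(_.split(","))
    .map(p => Person(p(0), p(1).trim.toInt))

  // Explicit conversion: createDataFrame infers the schema from the
  // Person case class via reflection; no implicit toDF() involved.
  val people = sqlContext.createDataFrame(peopleRdd)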