I have the same exception when I run the following example from the Spark SQL Programming Guide (Spark 1.2.0 documentation).
But it works fine when I run it from IntelliJ. I am using Spark 1.2 with Scala 2.10.4 and sbt 0.13.7.

import org.apache.spark.SparkContext

// Define the schema using a case class.
// Note: case classes in Scala 2.10 can support only up to 22 fields. To work around
// this limit, you can use custom classes that implement the Product interface.
case class Person(name: String, age: Int)

object SparkSqlDemo {
  def main(args: Array[String]) {
    val sc = new SparkContext("local", "SparkSqlTest")

    // sc is an existing SparkContext.
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    // createSchemaRDD is used to implicitly convert an RDD to a SchemaRDD.
    import sqlContext.createSchemaRDD

    // Create an RDD of Person objects and register it as a table.
    val people = sc.textFile("src/test/resources/people.txt")
      .map(_.split(","))
      .map(p => Person(p(0), p(1).trim.toInt))
    people.registerTempTable("people")

    // SQL statements can be run by using the sql methods provided by sqlContext.
    val teenagers = sqlContext.sql("SELECT name FROM people WHERE age >= 13 AND age <= 19")

    // The results of SQL queries are SchemaRDDs and support all the normal RDD operations.
    // The columns of a row in the result can be accessed by ordinal.
    teenagers.map(t => "Name: " + t(0)).collect().foreach(println)
  }
}

Shing

On Wednesday, 7 January 2015, 21:55, figpope <drewii...@gmail.com> wrote:

I'm on Spark 1.2.0, with Scala 2.11.2, and sbt 0.13.7.
When running:

case class Test(message: String)
val sc = new SparkContext("local", "shell")
val sqlContext = new SQLContext(sc)
import sqlContext._
val testing = sc.parallelize(List(Test("this"), Test("is"), Test("a"), Test("test")))
testing.saveAsParquetFile("test")

I get the following error:

scala.ScalaReflectionException: class org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with
java.net.URLClassLoader@64fc9229 of type class java.net.URLClassLoader with classpath [file:/Users/andrew/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.2.jar,file:/Users/andrew/.ivy2/cache/org.scala-lang/scala-compiler/jars/scala-compiler-2.11.2.jar,file:/Users/andrew/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.11.2.jar,file:/Users/andrew/.ivy2/cache/org.scala-lang.modules/scala-xml_2.11/bundles/scala-xml_2.11-1.0.2.jar,file:/Users/andrew/.ivy2/cache/org.scala-lang.modules/scala-parser-combinators_2.11/bundles/scala-parser-combinators_2.11-1.0.2.jar,file:/Users/andrew/.ivy2/cache/jline/jline/jars/jline-2.12.jar]
and parent being xsbt.boot.BootFilteredLoader@1f421ab0 of type class xsbt.boot.BootFilteredLoader with classpath [<unknown>]
and parent being sun.misc.Launcher$AppClassLoader@372f2b32 of type class sun.misc.Launcher$AppClassLoader with classpath [file:/usr/local/Cellar/sbt/0.13.7/libexec/sbt-launch.jar]
and parent being sun.misc.Launcher$ExtClassLoader@79bcfbeb of type class sun.misc.Launcher$ExtClassLoader with classpath [file:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/ext/dnsns.jar,file:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/ext/localedata.jar,file:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/ext/sunec.jar,file:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar,file:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar,file:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/ext/zipfs.jar,file:/System/Library/Java/Extensions/AppleScriptEngine.jar,file:/System/Library/Java/Extensions/dns_sd.jar,file:/System/Library/Java/Extensions/j3daudio.jar,file:/System/Library/Java/Extensions/j3dcore.jar,file:/System/Library/Java/Extensions/j3dutils.jar,file:/System/Library/Java/Extensions/jai_codec.jar,file:/System/Library/Java/Extensions/jai_core.jar,file:/System/Library/Java/Extensions/libAppleScriptEngine.jnilib,file:/System/Library/Java/Extensions/libJ3D.jnilib,file:/System/Library/Java/Extensions/libJ3DAudio.jnilib,file:/System/Library/Java/Extensions/libJ3DUtils.jnilib,file:/System/Library/Java/Extensions/libmlib_jai.jnilib,file:/System/Library/Java/Extensions/libQTJNative.jnilib,file:/System/Library/Java/Extensions/mlibwrapper_jai.jar,file:/System/Library/Java/Extensions/MRJToolkit.jar,file:/System/Library/Java/Extensions/QTJava.zip,file:/System/Library/Java/Extensions/vecmath.jar,file:/usr/lib/java/libjdns_sd.jnilib]
and parent being primordial classloader with boot classpath [/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/sunrsasign.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/lib/JObjC.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_25.jdk/Contents/Home/jre/classes] not found.
  at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:123)
  at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:22)
  at org.apache.spark.sql.catalyst.ScalaReflection$$typecreator1$1.apply(ScalaReflection.scala:115)
  at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
  at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
  at scala.reflect.api.TypeTags$class.typeOf(TypeTags.scala:341)
  at scala.reflect.api.Universe.typeOf(Universe.scala:61)
  at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:115)
  at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:33)
  at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:100)
  at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:33)
  at org.apache.spark.sql.catalyst.ScalaReflection$class.attributesFor(ScalaReflection.scala:94)
  at org.apache.spark.sql.catalyst.ScalaReflection$.attributesFor(ScalaReflection.scala:33)
  at org.apache.spark.sql.SQLContext.createSchemaRDD(SQLContext.scala:111)
  ... 43 elided

I'm also getting an error about unread block data when trying to use the saveAsNewAPIHadoopDataset method. However, when I run this same code from IntelliJ IDEA, I don't get this reflection error. Any idea what might be wrong?

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/ScalaReflectionException-when-using-saveAsParquetFile-in-sbt-tp21020.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
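[Editor's note on the mechanism, not part of the original thread.] The exception above comes from Scala runtime reflection: a JavaMirror resolves class names against one specific classloader, and the lookup fails when the class is not visible from that loader. The stack trace shows the mirror is bound below sbt's xsbt.boot.BootFilteredLoader, which is the usual reason such lookups behave differently under sbt than under IntelliJ. A minimal, self-contained sketch of that behaviour, using a hypothetical class name (com.example.DoesNotExist) purely for illustration:

```scala
import scala.reflect.runtime.{universe => ru}

object MirrorDemo {
  def main(args: Array[String]): Unit = {
    // A runtime mirror is bound to exactly one classloader.
    val mirror = ru.runtimeMirror(getClass.getClassLoader)

    // Resolvable: scala.Option is visible from this classloader.
    println(mirror.staticClass("scala.Option").fullName)

    // The same kind of lookup Spark's ScalaReflection performs, but for a
    // name this classloader cannot see, throws ScalaReflectionException.
    try {
      mirror.staticClass("com.example.DoesNotExist") // hypothetical class name
    } catch {
      case e: ScalaReflectionException => println("not found: " + e.getMessage)
    }
  }
}
```

A commonly suggested sbt-side workaround for this class of problem (an assumption here, not a fix confirmed in this thread) is to run the application in a forked JVM by setting `fork := true` in build.sbt, so user code runs under an ordinary classloader rather than sbt's layered one.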