You have to change most of the dependencies in the spark-examples module from
"provided" to "compile" scope so you can run the example in IntelliJ. With
"provided" scope, IntelliJ does not put the Spark jars on the runtime
classpath, which is why SparkConf cannot be found.
Yong

> Date: Fri, 3 Apr 2015 09:22:13 -0700
> From: eng.sara.must...@gmail.com
> To: dev@spark.apache.org
> Subject: IntelliJ Runtime error
> 
> Hi,
> 
> I have built Spark 1.3.0 successfully on IntelliJ IDEA 14, but when I try to
> run the SparkPi example under the examples module I face this error:
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/spark/SparkConf
>       at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:27)
>       at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>       ... 7 more
> 
> Could anyone help me please?
> 
> 
> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
> 
                                          
