You have marked the Spark dependencies as 'provided', but you are evidently
not providing them at runtime. You haven't said how you are running the
application; running it with spark-submit should set up the classpath correctly.
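As a rough sketch, an invocation might look like the following. The jar path,
main class name, and master URL are placeholders; substitute your own build
output and class:

```shell
# Hypothetical example: adjust the class name, master, and jar path
# to match your project. spark-submit places the Spark jars on the
# classpath for you, which is why 'provided' scope works with it.
spark-submit \
  --class com.example.KafkaSparkExample \
  --master local[2] \
  target/kafka-spark-example-1.0.jar
```

Note that classes from dependencies that are not part of the Spark
distribution still need to be packaged into your application jar.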

On Sun, Aug 3, 2014 at 12:47 PM, Mahebub Sayyed <mahebub...@gmail.com> wrote:
> Hello,
>
> I am getting following error while running kafka-spark-example:
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/spark/api/java/function/Function
> at java.lang.Class.getDeclaredMethods0(Native Method)
> at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
> at java.lang.Class.getMethod0(Class.java:2774)
> at java.lang.Class.getMethod(Class.java:1663)
> at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
> at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
> Caused by: java.lang.ClassNotFoundException:
> org.apache.spark.api.java.function.Function
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> ... 6 more
>
> My pom.xml:
>
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-core_2.10</artifactId>
>   <version>1.0.0</version>
>   <scope>provided</scope>
> </dependency>
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-streaming_2.10</artifactId>
>   <version>1.0.0</version>
>   <scope>provided</scope>
> </dependency>
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-streaming-kafka_2.10</artifactId>
>   <version>1.0.1</version>
> </dependency>
>
> Please help me.
>
> --
> Regards,
> Mahebub Sayyed

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
