Spark 1.5.2 depends on slf4j 1.7.10.

Looks like there was another version of slf4j on the classpath.

FYI
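A quick way to check is to split the `-cp` value from the log below and look for slf4j entries. A minimal sketch (the classpath string is copied from the quoted Spark Command; adjust if your layout differs):

```shell
# Classpath copied verbatim from the quoted "Spark Command" line
# (an assumption if your install differs).
CLASSPATH="/dcos/spark/sbin/../conf/:/dcos/spark/lib/spark-assembly-1.5.2-hadoop2.7.1.jar:/dcos/hadoop/etc/hadoop/"

# Print one entry per line and look for any slf4j jar.
echo "$CLASSPATH" | tr ':' '\n' | grep -qi slf4j || echo "no slf4j jar on the classpath"
```

If nothing matches, the problem is a missing slf4j rather than a conflicting one; if two different versions match, that's the conflict.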

On Mon, Jan 25, 2016 at 12:19 AM, kevin <kiss.kevin...@gmail.com> wrote:

> Hi all,
>     I need to test Hive on Spark, using Spark as Hive's execution
> engine.
>     I downloaded the Spark 1.5.2 source from the Apache web site.
>     I have installed Maven 3.3.9 and Scala 2.10.6, so I changed
> ./make-distribution.sh to point to the location where I installed mvn.
>
>     Then I ran this command:
> ./make-distribution.sh --name "hadoop2-without-hive" --tgz
> "-Pyarn,hadoop-2.7,hadoop-provided,parquet-provided" -DskipTests
> -Dhadoop.version=2.7.1
>
>     Is this all right? When I start the Spark cluster, I get this error:
>
>     Spark Command: /usr/lib/jdk/bin/java -cp
> /dcos/spark/sbin/../conf/:/dcos/spark/lib/spark-assembly-1.5.2-hadoop2.7.1.jar:/dcos/hadoop/etc/hadoop/
> -Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip
> 10.1.3.107 --port 7077 --webui-port 8080
> ========================================
> Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
> at java.lang.Class.getDeclaredMethods0(Native Method)
> at java.lang.Class.privateGetDeclaredMethods(Class.java:2615)
> at java.lang.Class.getMethod0(Class.java:2856)
> at java.lang.Class.getMethod(Class.java:1668)
> at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
> at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
> Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> ... 6 more
>
>
> I need some advice.
>
>

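One possibility worth ruling out for the build quoted above (paths are assumptions taken from the quoted classpath): the `-Phadoop-provided` profile leaves Hadoop and its transitive dependencies, slf4j among them, out of the assembly jar, and Spark's "Hadoop free" builds expect you to supply those jars yourself via SPARK_DIST_CLASSPATH in conf/spark-env.sh:

```shell
# conf/spark-env.sh -- sketch for a -Phadoop-provided build.
# /dcos/hadoop is an assumption based on the quoted classpath; point this
# at your own Hadoop install.
export SPARK_DIST_CLASSPATH=$(/dcos/hadoop/bin/hadoop classpath)
```

With that set, the Master/Worker launch scripts append Hadoop's jars (including its slf4j jars) to Spark's classpath, which would resolve a missing org/slf4j/Logger.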