The major.minor version of the classes in the new org.spark-project.hive hive-exec jar is 51.0, so it will require people to use JDK7. Is this intentional?

<dependency>
  <groupId>org.spark-project.hive</groupId>
  <artifactId>hive-exec</artifactId>
  <version>0.12.0-protobuf-2.5</version>
</dependency>
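For reference, the major version reported in the error can be read straight from the bytecode: the first eight bytes of any .class file are the magic number 0xCAFEBABE followed by the minor and major versions (50 = Java 6, 51 = Java 7). A minimal sketch (the class name ClassVersionCheck is mine; it inspects its own bytecode as a demo, but you could point the stream at HiveConf.class extracted from the jar above):

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ClassVersionCheck {

    // Returns the class-file "major" version: 50 = Java 6, 51 = Java 7.
    static int majorVersion(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        int magic = din.readInt();       // must be 0xCAFEBABE
        if (magic != 0xCAFEBABE) {
            throw new IOException("not a class file");
        }
        din.readUnsignedShort();         // minor version
        return din.readUnsignedShort();  // major version
    }

    public static void main(String[] args) throws Exception {
        // Demo: check this class's own bytecode from the classpath.
        InputStream in = ClassVersionCheck.class
                .getResourceAsStream("/ClassVersionCheck.class");
        try {
            System.out.println("major version = " + majorVersion(in));
        } finally {
            in.close();
        }
    }
}
```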

You can reproduce it with the following steps (they need to be run with JDK6):

1. Create a Test.java file with the following content:

public class Test {

    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hadoop.hive.conf.HiveConf");
    }

}

2. javac Test.java
3. java -classpath ~/.m2/repository/org/spark-project/hive/hive-exec/0.12.0-protobuf-2.5/hive-exec-0.12.0-protobuf-2.5.jar:. Test

Exception in thread "main" java.lang.UnsupportedClassVersionError:
org/apache/hadoop/hive/conf/HiveConf : Unsupported major.minor version 51.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:169)
at Test.main(Test.java:5)


Best Regards,
Shixiong Zhu
