I tried to build Spark 1.4.1 on CDH 5.4.0. Because we need to support PySpark, I used JDK 1.6.

I got the following error:

[INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first) @ spark-streaming_2.10 ---

java.lang.UnsupportedClassVersionError: org/apache/hadoop/io/LongWritable : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)

I know this is because the Hadoop jars for CDH 5.4.0 are built with JDK 7: class file major version 51 is Java 7, and a JDK 6 runtime can only load classes up to major version 50. Has anyone done this before?
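
In case it helps with debugging: you can confirm which JDK a given class was compiled with by reading the two version shorts at the start of its class file. Here is a minimal sketch (the class name ClassVersion is my own; it sticks to JDK 6 APIs so it compiles in this setup):

    import java.io.DataInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;

    // Prints the class file version of each .class file given on the
    // command line. Major 50 = Java 6, 51 = Java 7, 52 = Java 8.
    public class ClassVersion {
        public static void main(String[] args) throws IOException {
            for (String path : args) {
                DataInputStream in = new DataInputStream(new FileInputStream(path));
                try {
                    in.readInt();                       // magic number 0xCAFEBABE
                    int minor = in.readUnsignedShort(); // minor version
                    int major = in.readUnsignedShort(); // major version
                    System.out.println(path + ": major=" + major + " minor=" + minor);
                } finally {
                    in.close();
                }
            }
        }
    }

Extracting the class from the CDH hadoop-common jar (jar xf <hadoop-common jar> org/apache/hadoop/io/LongWritable.class) and running the sketch on it should print major=51, matching the error above.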

Thanks,

-- 
Chen Song
