Hi all,

When compiling Spark 1.4.0 with Java 1.6.0_20 (Maven version: 3.2.5, Scala 
version: 2.10.4), I always encounter the following compile error:

****/spark-1.4.0/unsafe/src/main/java/org/apache/spark/unsafe/PlatformDependent.java:149:
 copyMemory(long,long,long) in sun.misc.Unsafe cannot be applied to 
(java.lang.Object,long,java.lang.Object,long,long)
[ERROR]       _UNSAFE.copyMemory(src, srcOffset, dst, dstOffset, size);
[ERROR]              ^
[ERROR] 1 error
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  7.304 s]
[INFO] Spark Launcher Project ............................. SUCCESS [ 13.355 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 11.703 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  5.817 s]
[INFO] Spark Project Unsafe ............................... FAILURE [  1.048 s]
[INFO] Spark Project Core ................................. SKIPPED
[INFO] Spark Project Bagel ................................ SKIPPED
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 40.705 s
[INFO] Finished at: 2015-06-26T09:15:33+08:00
[INFO] Final Memory: 74M/838M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal 
net.alchim31.maven:scala-maven-plugin:3.2.1:compile (scala-compile-first) on 
project spark-unsafe_2.10: Execution scala-compile-first of goal 
net.alchim31.maven:scala-maven-plugin:3.2.1:compile failed. CompileFailed -> 
[Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-unsafe_2.10

According to the error message “/PlatformDependent.java:149: 
copyMemory(long,long,long) in sun.misc.Unsafe cannot be applied to 
(java.lang.Object,long,java.lang.Object,long,long)”, I found this in 
PlatformDependent.java:
141   static public void copyMemory(
142       Object src,
143       long srcOffset,
144       Object dst,
145       long dstOffset,
146       long length) {
147     while (length > 0) {
148       long size = Math.min(length, UNSAFE_COPY_THRESHOLD);
149       _UNSAFE.copyMemory(src, srcOffset, dst, dstOffset, size);
150       length -= size;
151       srcOffset += size;
152       dstOffset += size;
153     }
154   }
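
To isolate the failing call from the rest of the build, here is a minimal 
standalone sketch (my own code, not from the Spark tree; the class name 
CopyMemoryRepro is made up). Compiling just this one file with javac 1.6 
should fail on the copyMemory call with the same "cannot be applied" error, 
while javac 1.7 accepts it:

  import sun.misc.Unsafe;
  import java.lang.reflect.Field;

  public class CopyMemoryRepro {
    public static void main(String[] args) throws Exception {
      // Grab the Unsafe singleton via reflection, since its constructor is private.
      Field f = Unsafe.class.getDeclaredField("theUnsafe");
      f.setAccessible(true);
      Unsafe unsafe = (Unsafe) f.get(null);

      byte[] src = new byte[8];
      byte[] dst = new byte[8];
      long base = unsafe.arrayBaseOffset(byte[].class);
      // The five-argument, Object-based overload that PlatformDependent.java calls.
      unsafe.copyMemory(src, base, dst, base, 8L);
    }
  }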

It seems that copyMemory(java.lang.Object, long, java.lang.Object, long, long) 
was only introduced in Java 7? But according to the official documentation, 
"Spark runs on Java 6+, Python 2.6+ and R 3.1+. For the Scala API, Spark 1.4.0 
uses Scala 2.10. You will need to use a compatible Scala version (2.10.x)."
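
One way to check that hypothesis is to ask the running JVM, via reflection, 
whether the Object-based overload exists at all. Below is a small sketch of my 
own (the class name CopyMemoryCheck is made up); if the overload really is a 
Java 7 addition, this should report it as missing on Java 1.6.0_20 and find it 
on Java 7+:

  import java.lang.reflect.Method;

  public class CopyMemoryCheck {
    public static void main(String[] args) {
      try {
        Class<?> unsafe = Class.forName("sun.misc.Unsafe");
        // Look up copyMemory(Object, long, Object, long, long) by its exact signature.
        Method m = unsafe.getDeclaredMethod("copyMemory",
            Object.class, long.class, Object.class, long.class, long.class);
        System.out.println("Found: " + m);
      } catch (NoSuchMethodException e) {
        System.out.println("Object-based copyMemory overload not available on this JDK");
      } catch (ClassNotFoundException e) {
        System.out.println("sun.misc.Unsafe not available on this JVM");
      }
    }
  }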

So, how can I compile Spark 1.4.0 successfully with Java 1.6.0_20? Has anybody 
else hit the same error? Thank you very much for your help. :)

Sincerely,
Young
