Dear all, 

I used the command "./build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.6
-DskipTests clean package" to
compile Spark 2.4, but the build failed on Spark Project Tags with this error:

Cannot run program
"/Library/Java/JavaVirtualMachines/jdk1.8.0_131.jdk/Contents/Home/jre/bin/javac":
error=2, No such file or directory

I double-checked that
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_131.jdk/Contents/Home,
and both "java -version" and "javac -help" run successfully.


Here is the relevant part of the compile output:

[INFO] Using zinc server for incremental compilation
[INFO] Toolchain in scala-maven-plugin:
/Library/Java/JavaVirtualMachines/jdk1.8.0_131.jdk/Contents/Home/jre
[warn] Pruning sources from previous analysis, due to incompatible
CompileSetup.
[info] Compiling 2 Scala sources and 8 Java sources to
/Users/wuyi/workspace/spark/common/tags/target/scala-2.12/classes...
[error] Cannot run program
"/Library/Java/JavaVirtualMachines/jdk1.8.0_131.jdk/Contents/Home/jre/bin/javac":
error=2, No such file or directory


I would expect Maven to look for javac under $JAVA_HOME/bin, so why does it
go to $JAVA_HOME/jre/bin instead?
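For what it's worth, on a standard JDK 8 layout the compiler ships only in the JDK's bin/, never in the embedded jre/bin/, so that path can never resolve. A minimal sketch of the layout (simulated with a temp directory, since the real install paths differ per machine):

```shell
# Simulate the JDK 8 directory layout: javac lives in bin/ only;
# the embedded JRE carries java but no compiler.
JDK=$(mktemp -d)
mkdir -p "$JDK/bin" "$JDK/jre/bin"
touch "$JDK/bin/javac" "$JDK/bin/java" "$JDK/jre/bin/java"

# The compiler is found here:
test -e "$JDK/bin/javac" && echo "bin/javac: present"

# ...but not here, which matches the "No such file or directory" error:
test -e "$JDK/jre/bin/javac" || echo "jre/bin/javac: missing"
```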

I'd appreciate it a lot if any devs could give me a hint. Thanks!

Best wishes. 
wuyi




--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
