Hi all,
I ran into a problem when building Spark.
I am using Maven 3.1.1, Spark 1.0.1, Scala 2.10.1, and Hadoop 1.2.1 (OS: Ubuntu
12.04).
I first downloaded the package and unzipped it into a directory called
"/home/hduser/spark".
Then I ran the following:
$ cd  /home/hduser/spark
$ export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
$ mvn -Dhadoop.version=1.2.1 -DskipTests clean package
However, I got the following error:
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:exec 
(default) on project spark-core_2.10: Command execution failed. Cannot run 
program "unzip" (in directory "/home/hduser/downloads/spark-1.0.1/python"): 
error=2, No such file or directory -> [Help 1]
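For context, "error=2, No such file or directory" for the program "unzip" usually means the unzip binary itself is not installed or not on the PATH. A quick way to check (a sketch, assuming a Debian-based system like Ubuntu) would be:

```shell
# Check whether the "unzip" program the build invokes is on the PATH.
# On Debian/Ubuntu it can be installed with apt-get if missing.
if command -v unzip >/dev/null 2>&1; then
    echo "unzip found at: $(command -v unzip)"
else
    echo "unzip missing; try: sudo apt-get install unzip"
fi
```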
Is there anything wrong with my setup?
Thanks in advance.
Jack

