Hi all,

I'm trying to upgrade some Spark RPMs from 1.1.0 to 1.2.0.  As part of the
RPM process, we build Spark with Maven. With Spark 1.2.0, though, the
artifacts end up under com/google/guava in the local repository, and there
is no org/apache/spark directory at all.

I saw that the pom.xml files had been modified to skip the default install
step, and that the handling of the Guava dependency was changed.  Could
someone who is more familiar with the Spark Maven build comment on what
might be causing this oddity?

Thanks,
RJ

We build Spark like so:
$ mvn -Phadoop-2.4 -Dmesos.version=0.20.0 -DskipTests clean package
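(As a hedged diagnostic, not a confirmed cause: a shaded assembly jar can
bundle META-INF/maven metadata for several artifacts, including
com.google.guava. Listing the embedded pom files after the build shows what
coordinates a later install-file run might pick up if none are given
explicitly.)

    # Assumes the package build above succeeded and produced the assembly jar.
    # Lists every Maven pom bundled inside it; a com.google.guava entry here
    # would be a candidate source of the stray com/google/guava coordinates.
    unzip -l assembly/target/scala-2.10/spark-assembly-1.2.0-hadoop2.4.0.jar \
      | grep 'META-INF/maven/.*/pom\.xml'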

Then build a local Maven repo:

mvn -Phadoop-2.4 \
    -Dmesos.version=0.20.0 \
    -DskipTests install:install-file  \
    -Dfile=assembly/target/scala-2.10/spark-assembly-1.2.0-hadoop2.4.0.jar \
    -DcreateChecksum=true \
    -DgeneratePom=true \
    -DartifactId=spark-assembly_2.10 \
    -DlocalRepositoryPath=../maven2
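(A sketch of a possible workaround, assuming the groupId is being inferred
from metadata embedded in the shaded jar rather than anything we pass in:
give install-file the full coordinates explicitly. The groupId/version
values below are what I'd expect for this artifact, not something the build
output confirms.)

    # Same install as above, but with explicit coordinates so install-file
    # cannot fall back to coordinates read from an embedded pom.
    mvn -Phadoop-2.4 \
        -Dmesos.version=0.20.0 \
        -DskipTests install:install-file \
        -Dfile=assembly/target/scala-2.10/spark-assembly-1.2.0-hadoop2.4.0.jar \
        -DgroupId=org.apache.spark \
        -DartifactId=spark-assembly_2.10 \
        -Dversion=1.2.0 \
        -Dpackaging=jar \
        -DcreateChecksum=true \
        -DgeneratePom=true \
        -DlocalRepositoryPath=../maven2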
