I'll explore whether sbt is more flexible and does what's needed.
Andrew
From: Michael Armbrust [mailto:mich...@databricks.com]
Sent: 26 August 2015 03:12
To: Marcelo Vanzin
Cc: Rowson, Andrew G. (Financial&Risk); dev@spark.apache.org
Subject: Re: Spark builds: allow user overri
This isn't really answering the question, but for what it's worth: I
manage several different branches of Spark and regularly publish
custom-named versions to an internal repository, and this is *much*
easier with SBT than with Maven. You can actually link the Spark SBT
build into an external SBT build.
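
For reference, a minimal build.sbt sketch of the kind of setup described above — picking up an overridable version string and publishing to an internal repository. The repository URL, credentials handling, and version suffix here are all hypothetical placeholders, not anything from the actual Spark build:

```scala
// build.sbt — hypothetical sketch; repo URL and default version are placeholders.
// The version can be overridden at invocation time, e.g.:
//   sbt -Dspark.version=1.4.1-custom-string publish
version := sys.props.getOrElse("spark.version", "1.4.1-SNAPSHOT")

// Publish to an internal repository (placeholder URL).
publishTo := Some("internal-releases" at "https://repo.example.com/internal/releases")
```

Because sys.props is evaluated when the build loads, each branch can be published under a distinct version string without editing any build files.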
On Tue, Aug 25, 2015 at 2:17 AM, wrote:
> Then, if I wanted to do a build against a specific profile, I could also
> pass in a -Dspark.version=1.4.1-custom-string and have the output artifacts
> correctly named. The default behaviour should be the same. Child pom files
> would need to reference $
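
The property-based approach being proposed might look roughly like this in the parent POM. Note this is a sketch: Maven warns about expressions in <version>, and the officially supported form of this pattern is the CI-friendly ${revision}/${sha1}/${changelist} properties added in Maven 3.5. The coordinates and default value below are illustrative:

```xml
<!-- Parent pom.xml sketch (illustrative): a default version that can be
     overridden on the command line, e.g.
       mvn -Dspark.version=1.4.1-custom-string package
     (command-line -D properties take precedence over <properties>) -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-parent</artifactId>
  <version>${spark.version}</version>
  <packaging>pom</packaging>
  <properties>
    <spark.version>1.4.1</spark.version>
  </properties>
</project>
```

Each child module would then point its <parent> version at the same property so the whole reactor picks up the override consistently.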