There is not yet a 1.2.0 branch, and there is no 1.2.0 release; master is
1.2.0-SNAPSHOT, not 1.2.0. Your final command is correct, but it is redundant
to run 'package' and then throw that work away with the 'clean' in the next
command. Just the final command, with '... clean install', is needed.
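
In other words, this single command should be enough to build Spark and put
the 1.2.0-SNAPSHOT artifacts into your local Maven repository
(~/.m2/repository by default), where your application build can then resolve
them:

mvn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean install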

On Thu, Oct 9, 2014 at 2:12 AM, Arun Luthra <arun.lut...@gmail.com> wrote:
> Hi Pat,
>
> Couple of points:
>
> 1) I must have done something naive like:
> git clone git://github.com/apache/spark.git -b branch-1.2.0
>
> because "git branch" is telling me I'm on the "master" branch, and I see
> that branch-1.2.0 doesn't exist (https://github.com/apache/spark).
> Nevertheless, when I compiled this master branch, the Spark shell reports
> that I have 1.2.0. So I guess master is currently 1.2.0...
>
> 2) The README on the master branch only has build instructions for Maven. I
> built Spark successfully with
> mvn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package
>
> and it looks like the publish-local equivalent for Maven is:
> mvn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean install
>
> I will report back with the result.
>
> On Wed, Oct 8, 2014 at 5:50 PM, Pat McDonough <pat.mcdono...@databricks.com>
> wrote:
>>
>> Hey Arun,
>>
>> Since this build depends on unpublished builds of Spark (1.2.0-SNAPSHOT),
>> you'll need to first build Spark and "publish-local" so your application
>> build can find those SNAPSHOTs in your local repo.
>>
>> Just append "publish-local" to your sbt command where you build Spark.
>>
>> -Pat
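
For reference, the sbt form of that publish-local step would look something
like the line below. The hadoop-2.2 profile and hadoop.version flags are my
assumption, carried over from the Maven command above, so check them against
the Spark build docs for your checkout:

sbt/sbt -Phadoop-2.2 -Dhadoop.version=2.2.0 publish-local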

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
