Re: How to configure build.sbt for Spark 1.2.0

2014-10-08 Thread Sean Owen
There is not yet a 1.2.0 branch; there is no 1.2.0 release. master is 1.2.0-SNAPSHOT, not 1.2.0. Your final command is correct, but it's redundant to run 'package' and then throw that work away with another 'clean'. Just the final command, ending in '... clean install', is needed.
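
For reference only (Arun's actual command is truncated out of this excerpt), a single Maven invocation of the kind described here, building the checked-out source and installing the artifacts into the local repository in one pass, would look roughly like:

    mvn -DskipTests clean install    # illustrative only; add whatever profiles/flags your build already uses

Running everything as one 'clean install' avoids packaging first and then discarding that work with a later 'clean'.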

Re: How to configure build.sbt for Spark 1.2.0

2014-10-08 Thread Arun Luthra
Hi Pat, A couple of points: 1) I must have done something naive like git clone git://github.com/apache/spark.git -b branch-1.2.0, because "git branch" tells me I'm on the "master" branch, and I see that branch-1.2.0 doesn't exist (https://github.com/apache/spark). Nevertheless, when I compile
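
As an aside not in the original message, the branch situation can be checked directly against the remote with standard git commands, shown here only as an illustration:

    git ls-remote --heads git://github.com/apache/spark.git    # list the branches that actually exist on the remote
    git branch                                                  # show which local branch is currently checked out

If branch-1.2.0 does not show up in the first listing, that matches Sean's point above that there is no 1.2.0 branch yet and that master carries 1.2.0-SNAPSHOT.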

Re: How to configure build.sbt for Spark 1.2.0

2014-10-08 Thread Pat McDonough
Hey Arun, Since this build depends on unpublished builds of Spark (1.2.0-SNAPSHOT), you'll need to first build Spark and "publish-local" so your application build can find those SNAPSHOTs in your local repo. Just append "publish-local" to the sbt command where you build Spark. -Pat
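
As a rough sketch (Pat's message does not show the exact invocation, and the command Arun used is not visible in this excerpt), appending publish-local to a Spark sbt build could look like:

    sbt/sbt assembly publish-local    # illustrative; 'assembly' stands in for whatever tasks were already being run

publish-local writes the 1.2.0-SNAPSHOT artifacts into the local Ivy repository, which sbt consults by default when a downstream build resolves dependencies.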

How to configure build.sbt for Spark 1.2.0

2014-10-08 Thread Arun Luthra
I built Spark 1.2.0 successfully, but was unable to build my Spark program under 1.2.0 with sbt assembly and my build.sbt file. It contains: I tried: "org.apache.spark" %% "spark-sql" % "1.2.0", "org.apache.spark" %% "spark-core" % "1.2.0", and "org.apache.spark" %% "spark-sql" % "1.2.0
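
Purely as an illustrative sketch (this is not Arun's actual build.sbt, and his dependency list is cut off above), a build.sbt aimed at a locally published build would declare the SNAPSHOT version that the replies above say master actually carries:

    // Hypothetical build.sbt fragment, not taken from the thread.
    // 1.2.0-SNAPSHOT is assumed to match what 'publish-local' put into the
    // local repository; the Scala version is a guess at the 2.10.x line that
    // Spark 1.x artifacts were published against.
    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.2.0-SNAPSHOT",
      "org.apache.spark" %% "spark-sql"  % "1.2.0-SNAPSHOT"
    )

With sbt assembly these dependencies are often marked % "provided" so the Spark jars are not bundled into the application jar, but that choice depends on how the job is deployed.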