That's the crude way to do it.  If you run `sbt/sbt publishLocal`, then you
can resolve the artifact from your local Ivy repository in the same way
that you would resolve it if it were deployed to a remote repository.
That's just the build step.  Actually running the application will require
the necessary jars to be accessible to the cluster nodes.
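
For example, something along these lines should work -- the exact version
string is whatever the RC build reports when it publishes (I'm assuming
"1.0.0" here), and the Scala version has to match the one Spark 1.0 was
built with (2.10.x):

  # in the Spark source checkout (the RC9 tag)
  $ sbt/sbt publishLocal     # publishes the Spark artifacts to ~/.ivy2/local

  // build.sbt of your application
  name := "my-app"

  scalaVersion := "2.10.4"

  // version assumed; check what the publishLocal step actually reports
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"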
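
For the run step, the simplest route in 1.0 is bin/spark-submit, which
ships your application jar to the executors for you.  The class name,
master URL, and jar path below are just placeholders:

  $ sbt assembly    # assuming you use the sbt-assembly plugin for your app
  $ /path/to/spark-1.0-rc9/bin/spark-submit \
      --class com.example.MyApp \
      --master spark://master-host:7077 \
      target/scala-2.10/my-app-assembly-0.1.jar

If you go the assembly route, it usually makes sense to mark the spark-core
dependency as "provided" so the Spark classes don't get bundled into your
application jar.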
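
Alternatively, for the lib-directory approach Nan suggests below: sbt
treats jars in the project's lib/ directory as unmanaged dependencies, so
copying the Spark assembly jar there also works.  The exact jar name
depends on the Hadoop version you built against; the layout below is just
a sketch:

  my-app/
    build.sbt
    lib/
      spark-assembly-1.0.0-hadoop1.0.4.jar   # copied from assembly/target/scala-2.10/ in the Spark checkout
    src/main/scala/MyApp.scala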


On Mon, May 19, 2014 at 7:04 PM, Nan Zhu <zhunanmcg...@gmail.com> wrote:

> Yes, you have to put the spark-assembly-*.jar into the lib directory of
> your application
>
> Best,
>
> --
> Nan Zhu
>
>
> On Monday, May 19, 2014 at 9:48 PM, nit wrote:
>
> > I am not very comfortable with sbt. I want to build a standalone
> > application using Spark 1.0 RC9. I can build an sbt assembly for my
> > application with Spark 0.9.1, and I think in that case Spark is pulled
> > from the Akka Repository?
> >
> > Now if I want to use 1.0 RC9 for my application, what is the process?
> > (FYI, I was able to build Spark 1.0 via sbt/sbt assembly and I can see
> > the spark-assembly jar; I think I will have to copy that jar somewhere
> > and update build.sbt?)
> >
> > PS: I am not sure if this is the right place for this question, but
> > since 1.0 is still an RC, I felt this may be the appropriate forum.
> >
> > thanks!
> >
> >
> >
