On a related note, there is also a staging Apache repository where the
latest RC gets pushed:
https://repository.apache.org/content/repositories/staging/org/apache/spark/spark-core_2.10/
The artifact here is just named "1.0.0" (similar to the RC-specific
repository that Patrick mentioned).
This is the first time I've learned there is a temporary Maven repository…
--
Nan Zhu
On Monday, May 19, 2014 at 10:10 PM, Patrick Wendell wrote:
> Whenever we publish a release candidate, we create a temporary maven
> repository that hosts the artifacts. We do this precisely for the case
> you are running into.
Thanks everyone. I followed Patrick's suggestion and it worked like a charm.
--
Threads like these are great candidates to be part of the "Contributors
guide". I will create a JIRA to update the guide with material from past
threads like these.
Sujeet
On Mon, May 19, 2014 at 7:10 PM, Patrick Wendell wrote:
> Whenever we publish a release candidate, we create a temporary maven
> repository that hosts the artifacts.
Whenever we publish a release candidate, we create a temporary maven
repository that hosts the artifacts. We do this precisely for the case
you are running into (where a user wants to build an application
against it to test).
You can build against the release candidate by just adding that
repository.
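For sbt users, the step Patrick describes amounts to adding one extra resolver to the build. A minimal build.sbt sketch, assuming the staging repository URL mentioned earlier in the thread (the exact RC-specific repository URL comes from the vote email, so treat this one as a placeholder):

```scala
// build.sbt -- minimal sketch; the resolver URL below is the staging
// repository mentioned earlier in this thread, not necessarily the
// RC-specific one from the vote email.
name := "my-spark-app"

scalaVersion := "2.10.4"

// Extra resolver so sbt can find the release-candidate artifacts.
resolvers += "Apache Staging" at
  "https://repository.apache.org/content/repositories/staging/"

// The staged artifact is published under the plain version "1.0.0".
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
```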
That's the crude way to do it. If you run `sbt/sbt publishLocal`, then you
can resolve the artifact from your local cache in the same way that you
would resolve it if it were deployed to a remote cache. That's just the
build step. Actually running the application will require the necessary
jars.
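As a sketch of the publishLocal route: from a checkout of the RC tag, publish the artifacts into the local Ivy cache once, then depend on them as usual. The version string "1.0.0" here is an assumption; use whatever version the RC build actually publishes.

```scala
// In the Spark source checkout (RC tag), run once from the shell:
//   sbt/sbt publishLocal
// This puts the spark-core artifact into ~/.ivy2/local, where sbt
// resolves it exactly as if it came from a remote repository.
//
// Then, in the application's build.sbt:
scalaVersion := "2.10.4"

// Resolved from the local Ivy cache populated by publishLocal above;
// "1.0.0" is an assumed version string for the RC artifacts.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
```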
Then, you have to put spark-assembly-*.jar into the lib directory of your
application.
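If you go the assembly-jar route Nan describes, sbt will pick the jar up automatically: anything under lib/ is treated as an unmanaged dependency. A sketch of the layout, with hypothetical file names:

```scala
// Project layout (hypothetical paths):
//   my-app/
//     build.sbt
//     lib/
//       spark-assembly-1.0.0-hadoop1.0.4.jar   // copied from the Spark build
//     src/main/scala/...
//
// No extra setting is needed -- lib/ is sbt's default unmanagedBase --
// but it can be pointed elsewhere explicitly if desired:
unmanagedBase := baseDirectory.value / "lib"
```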
Best,
--
Nan Zhu
On Monday, May 19, 2014 at 9:48 PM, nit wrote:
> I am not very comfortable with sbt. I want to build a standalone application
> using Spark 1.0 RC9. I can build an sbt assembly for my application
I am not very comfortable with sbt. I want to build a standalone application
using Spark 1.0 RC9. I can build an sbt assembly for my application with Spark
0.9.1, and I think in that case Spark is pulled from the Akka Repository?
Now if I want to use 1.0 RC9 for my application, what is the process?
(FYI