Re: How to use spark-submit

2014-05-16 Thread Andrew Or
What cluster mode are you running in? You may need to specify the jar through --jars, though we're working on making spark-submit automatically add the provided jar to the class path so we don't run into ClassNotFoundExceptions like the one you hit. What is the command that you ran? On Tue, May 6,
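[Editor's sketch] A minimal example of what that --jars invocation could look like. The jar paths and main class below are hypothetical placeholders, and the command is only echoed as a dry run rather than executed:

```shell
# Dry-run sketch: --jars ships extra, comma-separated jars to the cluster
# and puts them on the class path, which is what avoids the
# ClassNotFoundException for third-party dependencies.
APP_JAR="target/my-app.jar"             # hypothetical application jar
DEPS="libs/dep-a.jar,libs/dep-b.jar"    # hypothetical dependency jars

CMD="spark-submit --master local --jars $DEPS --class com.example.Main $APP_JAR"
echo "$CMD"   # echo instead of executing, since this is only a sketch
```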

Re: How to use spark-submit

2014-05-14 Thread phoenix bai
I used spark-submit to run the MovieLensALS example from the examples module. Here is the command: $spark-submit --master local /home/phoenix/spark/spark-dev/examples/target/scala-2.10/spark-examples-1.0.0-SNAPSHOT-hadoop1.0.4.jar --class org.apache.spark.examples.mllib.MovieLensALS u.data also,
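[Editor's sketch] One likely issue with the command above: spark-submit treats everything after the application jar as arguments to the application itself, so --class (and all other options) must come before the jar. A dry-run sketch of the reordered command, reusing the jar path from the post:

```shell
# The jar path is copied from the post; the command is echoed, not run.
EXAMPLES_JAR="/home/phoenix/spark/spark-dev/examples/target/scala-2.10/spark-examples-1.0.0-SNAPSHOT-hadoop1.0.4.jar"

# Options first, then the application jar, then application arguments:
CMD="spark-submit --master local --class org.apache.spark.examples.mllib.MovieLensALS $EXAMPLES_JAR u.data"
echo "$CMD"
```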

Re: How to use spark-submit

2014-05-13 Thread Sonal Goyal
Hi Stephen, Sorry I just use plain mvn. Best Regards, Sonal Nube Technologies On Mon, May 12, 2014 at 12:29 PM, Stephen Boesch wrote: > @Sonal - makes sense. Is the maven shade plugin runnable within sbt ? If > so would you c

Re: How to use spark-submit

2014-05-12 Thread Sonal Goyal
I am creating a jar with only my dependencies and running spark-submit through my project's mvn build. I have configured the mvn exec goal with the location of the script. Here is how I have set it up for my app. The mainClass is my driver program, and I am able to pass my custom args too. Hope this helps.

Re: How to use spark-submit

2014-05-12 Thread Stephen Boesch
@Sonal - makes sense. Is the maven shade plugin runnable within sbt? If so, would you care to share those build.sbt (or .scala) lines? If not, are you aware of a similar plugin for sbt? 2014-05-11 23:53 GMT-07:00 Sonal Goyal : > Hi Stephen, > > I am using the maven shade plugin for creating my u

Re: How to use spark-submit

2014-05-11 Thread Sonal Goyal
Hi Stephen, I am using maven shade plugin for creating my uber jar. I have marked spark dependencies as provided. Best Regards, Sonal Nube Technologies On Mon, May 12, 2014 at 1:04 AM, Stephen Boesch wrote: > HI Sonal, > Y
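[Editor's sketch] In shell terms, the workflow described here might look like the following. The artifact name and main class are hypothetical; the build step is shown as a comment and the submit command is only echoed:

```shell
# Build the uber jar with the maven-shade-plugin; because the Spark
# dependencies are marked <scope>provided</scope> in the POM, they are
# not bundled into it:
#   mvn -DskipTests package

UBER_JAR="target/my-app-1.0-shaded.jar"   # hypothetical shaded artifact
CMD="spark-submit --master local --class com.example.Driver $UBER_JAR"
echo "$CMD"   # dry run only
```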

Re: How to use spark-submit

2014-05-11 Thread Stephen Boesch
Hi Sonal, Yes, I am working towards that same idea. How did you go about creating the non-spark-jar dependencies? The way I am doing it is a separate straw-man project that does not include spark but has the external third party jars included. Then running sbt compile:managedClasspath and rev

Re: How to use spark-submit

2014-05-11 Thread Stephen Boesch
Just discovered sbt-pack: that addresses (quite well) the last item of identifying and packaging the external jars. 2014-05-11 12:34 GMT-07:00 Stephen Boesch : > Hi Sonal, > Yes, I am working towards that same idea. How did you go about > creating the non-spark-jar dependencies? The way I
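[Editor's sketch] A hedged sketch of how sbt-pack's output could feed spark-submit. The plugin version, output directory, and jar names are assumptions; the build steps are shown as comments and the submit command is only echoed:

```shell
# Assumed setup in project/plugins.sbt (version is a placeholder):
#   addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "x.y.z")
# Then "sbt pack" collects the project's dependency jars under target/pack/lib.

PACK_LIB="target/pack/lib"
# In a real run the list would come from the packed directory, e.g.:
#   DEPS=$(ls "$PACK_LIB"/*.jar | paste -sd, -)
DEPS="$PACK_LIB/dep-a.jar,$PACK_LIB/dep-b.jar"   # placeholder for this dry run

CMD="spark-submit --master local --jars $DEPS --class com.example.Driver target/my-app.jar"
echo "$CMD"
```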

Re: How to use spark-submit

2014-05-11 Thread Soumya Simanta
Will sbt-pack and the maven solution work for the Scala REPL? I need the REPL because it saves a lot of time when I'm playing with large data sets: I load them once, cache them, and then try things out interactively before putting them in a standalone driver. I've got sbt working for my own drive

Re: How to use spark-submit

2014-05-07 Thread Tathagata Das
Doesn't the run-example script work for you? Also, are you on the latest commit of branch-1.0? TD On Mon, May 5, 2014 at 7:51 PM, Soumya Simanta wrote: > > > Yes, I'm struggling with a similar problem where my classes are not found on > the worker nodes. I'm using 1.0.0_SNAPSHOT. I would really

Re: How to use spark-submit

2014-05-05 Thread Soumya Simanta
Yes, I'm struggling with a similar problem where my classes are not found on the worker nodes. I'm using 1.0.0_SNAPSHOT. I would really appreciate it if someone could provide some documentation on the usage of spark-submit. Thanks > On May 5, 2014, at 10:24 PM, Stephen Boesch wrote: > > > I ha

How to use spark-submit

2014-05-05 Thread Stephen Boesch
I have a Spark Streaming application that also uses the external streaming modules (e.g. kafka, mqtt, ..). It is not clear how to properly invoke the spark-submit script: what --driver-class-path and/or -Dspark.executor.extraClassPath parameters are required? For reference, the following
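[Editor's sketch] For reference, a sketch of how those two settings are typically supplied. The jar names and main class are hypothetical. --driver-class-path extends the driver's class path on the spark-submit command line, while spark.executor.extraClassPath is a configuration property (set in conf/spark-defaults.conf on older releases, or via --conf where that flag is available) rather than a -D system property:

```shell
# Dry-run sketch; the external streaming jars are hypothetical names,
# and the command is echoed rather than executed.
EXTRA="libs/spark-streaming-kafka.jar:libs/spark-streaming-mqtt.jar"

CMD="spark-submit --master local --driver-class-path $EXTRA --conf spark.executor.extraClassPath=$EXTRA --class com.example.StreamingApp target/my-app.jar"
echo "$CMD"
```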