I used spark-submit to run the MovieLensALS example from the examples
module. Here is the command:

$ spark-submit --master local \
    /home/phoenix/spark/spark-dev/examples/target/scala-2.10/spark-examples-1.0.0-SNAPSHOT-hadoop1.0.4.jar \
    --class org.apache.spark.examples.mllib.MovieLensALS u.data
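
If that option ordering doesn't work on your build: the Spark 1.0 docs show options such as --class and --master placed before the application jar, with application arguments after it. A sketch of that layout, reusing the same jar and input file as above:

$ spark-submit \
    --class org.apache.spark.examples.mllib.MovieLensALS \
    --master local \
    /home/phoenix/spark/spark-dev/examples/target/scala-2.10/spark-examples-1.0.0-SNAPSHOT-hadoop1.0.4.jar \
    u.data
    # same jar and rating file as the command above; ordering per the 1.0 docs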

Also, you can check the available options of spark-submit with: $ spark-submit --help
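
On the external streaming modules question quoted below (Kafka, MQTT, etc.): one approach that I believe should work is to pass those extra jars with --jars, so they are put on both the driver and executor classpaths. This is only a rough sketch; the jar paths and main class below are placeholders, not something I have tested against branch-1.0:

$ spark-submit --master local[2] \
    --class your.app.StreamingMain \
    --jars /path/to/spark-streaming-kafka_2.10-1.0.0-SNAPSHOT.jar,/path/to/kafka_2.10-0.8.0.jar \
    /path/to/your-streaming-app.jar
    # --class value and all paths are placeholders; substitute your own app and jars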

hope this helps!


On Wed, May 7, 2014 at 9:27 AM, Tathagata Das
<tathagata.das1...@gmail.com> wrote:

> Doesn't the run-example script work for you? Also, are you on the latest
> commit of branch-1.0?
>
> TD
>
>
> On Mon, May 5, 2014 at 7:51 PM, Soumya Simanta
> <soumya.sima...@gmail.com> wrote:
>
>>
>>
>> Yes, I'm struggling with a similar problem where my classes are not found
>> on the worker nodes. I'm using 1.0.0-SNAPSHOT. I would really appreciate it
>> if someone could provide some documentation on the usage of spark-submit.
>>
>> Thanks
>>
>> > On May 5, 2014, at 10:24 PM, Stephen Boesch <java...@gmail.com> wrote:
>> >
>> >
>> > I have a Spark Streaming application that uses the external streaming
>> modules (e.g. Kafka, MQTT, ...) as well. It is not clear how to properly
>> invoke the spark-submit script: what --driver-class-path and/or
>> -Dspark.executor.extraClassPath parameters are required?
>> >
>> >  For reference, the following error is proving difficult to resolve:
>> >
>> > java.lang.ClassNotFoundException:
>> org.apache.spark.streaming.examples.StreamingExamples
>> >
>>
>
>
