Hi Raghav,

If you want to make changes to Spark and run your application with your
modified build, you can follow these steps (a concrete example with
placeholder names follows the list).

1. git clone git@github.com:apache/spark
2. cd spark; build/mvn clean package -DskipTests [...]
3. make local changes
4. build/mvn package -DskipTests [...] (no need to clean again here)
5. bin/spark-submit --master spark://[...] --class your.main.class your.jar
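
For example, assuming a hypothetical application jar target/my-app.jar with a
main class com.example.MyApp and a standalone master at spark://master-host:7077
(substitute your own values), the whole cycle looks roughly like this:

    git clone git@github.com:apache/spark
    cd spark
    build/mvn clean package -DskipTests    # first full build
    # ... make your local changes to the Spark sources ...
    build/mvn package -DskipTests          # re-package, no clean needed
    bin/spark-submit \
      --master spark://master-host:7077 \
      --class com.example.MyApp \
      target/my-app.jar

If your application needs extra dependency jars, you can also list them with
--jars, as mentioned in the quoted message below.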

There is no need to pass extra --driver-java-options or --driver-class-path
flags as others have suggested. When you use spark-submit, the Spark classes
come from the assembly jar in assembly/target/scala-2.10, which is produced by
"mvn package". You just have to make sure that you re-package the assembly jar
after each modification.
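
If in doubt, a quick sanity check (assuming a Scala 2.10 build; the exact jar
name depends on your Spark version and Hadoop profile) is to re-package and
look at the assembly jar's timestamp:

    build/mvn package -DskipTests
    ls -l assembly/target/scala-2.10/spark-assembly-*.jar    # should show a fresh timestamp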

-Andrew

2015-06-18 16:35 GMT-07:00 maxdml <max...@cs.duke.edu>:

> You can specify your application's dependency jars to be included with
> spark-submit using the --jars switch.
>
> Otherwise, are you sure that your newly compiled Spark assembly jar is in
> assembly/target/scala-2.10/?
>
