And the Scala way of doing it would be:

val sc = new SparkContext(conf)
sc.addJar("/full/path/to/my/application/jar/myapp.jar")
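For completeness, a self-contained sketch of the same approach (the master URL, app name, and jar path below are placeholders, not values from this thread):

import org.apache.spark.{SparkConf, SparkContext}

object SubmitFromCode {
  def main(args: Array[String]): Unit = {
    // Placeholder master URL and app name -- substitute your own values.
    val conf = new SparkConf()
      .setAppName("myapp")
      .setMaster("spark://master-host:7077")

    val sc = new SparkContext(conf)

    // Ship the application jar so the executors can load your classes.
    sc.addJar("/full/path/to/my/application/jar/myapp.jar")

    // A trivial job to confirm the executors can run code from the jar.
    println(sc.parallelize(1 to 100).count())

    sc.stop()
  }
}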


On Wed, Oct 29, 2014 at 1:44 AM, Shailesh Birari <sbir...@wynyardgroup.com>
wrote:

> Yes, this is doable.
> I am submitting the Spark job using:
> JavaSparkContext spark = new JavaSparkContext(sparkMaster,
>             "app name", System.getenv("SPARK_HOME"),
>             new String[] {"application JAR"});
>
> To run this, first build the application jar and pass its absolute path in
> the API above.
> That's all. Run your Java application like any other.
>
>   Shailesh
