Hey AJ,

If you plan to launch your job on a cluster, consider using the spark-submit 
command.
Running it from Spark's home directory with no arguments prints usage help:

$ ./bin/spark-submit

I haven't tried it yet, but according to this post it will be the preferred way 
to launch jobs:
http://apache-spark-user-list.1001560.n3.nabble.com/Running-a-spark-submit-compatible-app-in-spark-shell-td4905.html
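For reference, a typical spark-submit invocation looks something like the sketch below; the class name, master URL, memory setting, and jar path are all illustrative placeholders, not values from this thread:

```shell
# Submit an application jar to a standalone cluster.
# All names and values below are placeholders; adjust for your setup.
./bin/spark-submit \
  --class com.example.MyApp \
  --master spark://master-host:7077 \
  --executor-memory 2G \
  path/to/my-app.jar arg1 arg2
```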

Cheers

On 2014-05-04, at 13:35, Ajay Nair <prodig...@gmail.com> wrote:

> Thank you. I am trying this now
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-developers-list.1001551.n3.nabble.com/Apache-Spark-running-out-of-the-spark-shell-tp6459p6472.html
> Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
> 

Nicolas Garneau
ngarn...@ngarneau.com
