You don't need to do anything special to run in local mode from within Eclipse. Just create a simple SparkConf and create a SparkContext from that. I have unit tests which execute on a local SparkContext, and they work from inside Eclipse as well as SBT.
val conf = new SparkConf().setMaster("local").setAppName("Whatever")
val sc = new SparkContext(conf)

Keep in mind you can only have one local SparkContext at a time, otherwise you will get some weird errors. If your tests run sequentially, make sure to close the SparkContext in your teardown method. If your tests run in parallel, you'll need to share the SparkContext between tests.

For unit testing, you can use SparkContext.parallelize to set up your test inputs and RDD.collect to retrieve the outputs (a minimal sketch follows at the end of this message).

On Wed, Oct 1, 2014 at 7:43 PM, Ashish Jain <ashish....@gmail.com> wrote:

> Hello Sanjay,
>
> This can be done, and is a very effective way to debug.
>
> 1) Compile and package your project to get a fat jar.
> 2) In your SparkConf, use setJars and give the location of this jar. Also
> set your master as local in the SparkConf.
> 3) Use this SparkConf when creating the JavaSparkContext.
> 4) Debug your program like you would any normal program.
>
> Hope this helps.
>
> Thanks
> Ashish
>
> On Oct 1, 2014 4:35 PM, "Sanjay Subramanian"
> <sanjaysubraman...@yahoo.com.invalid> wrote:
>
>> hey guys
>>
>> Is there a way to run Spark in local mode from within Eclipse?
>> I am running Eclipse Kepler on a MacBook Pro with Mavericks.
>> Like one can run Hadoop map/reduce applications from within Eclipse and
>> debug and learn.
>>
>> thanks
>>
>> sanjay
>>

--
Daniel Siegmann, Software Developer
Velos
Accelerating Machine Learning

440 NINTH AVENUE, 11TH FLOOR, NEW YORK, NY 10001
E: daniel.siegm...@velos.io W: www.velos.io
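To make the unit-testing advice above concrete, here is a minimal sketch of such a test. It assumes ScalaTest's FunSuite with BeforeAndAfterAll (any test framework with setup/teardown hooks would work); the suite name, test data, and assertion are purely illustrative.

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Illustrative suite: one local SparkContext for the whole suite,
// stopped in afterAll so later suites can create their own.
class WordLengthSuite extends FunSuite with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("WordLengthSuite")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    if (sc != null) sc.stop()
  }

  test("maps words to their lengths") {
    // parallelize builds the test input; collect pulls the result back
    // to the driver so a plain assertion can be used.
    val words = sc.parallelize(Seq("spark", "local", "eclipse"))
    val lengths = words.map(_.length).collect()
    assert(lengths.toSeq === Seq(5, 5, 7))
  }
}

A suite like this runs the same way from inside Eclipse as from SBT. If suites run in parallel, the SparkContext would need to be shared across suites rather than created per suite, as noted above.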