Hello Sanjay,

This can be done, and is a very effective way to debug.

1) Compile and package your project to get a fat jar
2) In your SparkConf, call setJars with the location of this jar. Also set
the master to local in the same SparkConf
3) Use this SparkConf when creating your JavaSparkContext
4) Debug your program as you would any normal Java program.
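
A minimal sketch of steps 2 and 3, assuming a fat jar produced by step 1
at target/myproject-fat.jar (the jar path and app name below are
placeholders for your own project):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class LocalDebugExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("local-debug")     // placeholder app name
                .setMaster("local")            // run Spark in local mode inside the Eclipse JVM
                .setJars(new String[] {
                        "target/myproject-fat.jar"  // placeholder: fat jar from step 1
                });

        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... build your RDDs here; breakpoints work as in any Java program ...
        sc.stop();
    }
}
```

With the master set to local, the driver and executors all run in the
Eclipse JVM, so Eclipse breakpoints hit directly. Use "local[*]" instead
of "local" to use all cores.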

Hope this helps.

Thanks
Ashish
On Oct 1, 2014 4:35 PM, "Sanjay Subramanian"
<sanjaysubraman...@yahoo.com.invalid> wrote:

> hey guys
>
> Is there a way to run Spark in local mode from within Eclipse?
> I am running Eclipse Kepler on a MacBook Pro with Mavericks.
> Just like one can run hadoop map/reduce applications from within Eclipse
> to debug and learn.
>
> thanks
>
> sanjay
>
