... at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 12 more
I am going to keep working to solve this. Meanwhile, if you can provide some
guidance, that would be cool.
sanjay
cool thanks will set this up and report back how things went
regards
sanjay
From: Daniel Siegmann
To: Ashish Jain
Cc: Sanjay Subramanian ; "user@spark.apache.org"
Sent: Thursday, October 2, 2014 6:52 AM
Subject: Re: Spark inside Eclipse
You don't need to do anything special to run in local mode from within
Eclipse. Just create a simple SparkConf and create a SparkContext from
that. I have unit tests which execute on a local SparkContext, and they
work from inside Eclipse as well as SBT.
val conf = new SparkConf().setMaster("local")
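A fuller sketch of that pattern (a minimal example assuming the Scala API;
the app name and the toy job are illustrative, not from the original mail):

import org.apache.spark.{SparkConf, SparkContext}

object LocalModeExample {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark in-process on all available cores;
    // plain "local" uses a single worker thread.
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("eclipse-local-test")
    val sc = new SparkContext(conf)
    try {
      // Trivial job just to confirm the context works.
      println(sc.parallelize(Seq("a", "b", "a")).countByValue())
    } finally {
      sc.stop()
    }
  }
}

The same SparkConf works inside a unit test; just remember to stop the
context in teardown, since only one SparkContext can be active per JVM.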
Hello Sanjay,
This can be done, and is a very effective way to debug.
1) Compile and package your project to get a fat jar
2) In your SparkConf, use setJars to give the location of this jar, and set
your master to local (see the sketch after these steps)
3) Use this SparkConf when creating the JavaSparkContext
4) Debug
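A sketch of steps 2-4 (in Scala, for consistency with the snippet above; the
jar path and app name are placeholders, not values from this thread):

import org.apache.spark.SparkConf
import org.apache.spark.api.java.JavaSparkContext

object DebugInEclipse {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local")                            // step 2: local master
      .setAppName("debug-in-eclipse")                // placeholder app name
      .setJars(Seq("target/myapp-assembly-0.1.jar")) // step 2: placeholder path to the fat jar from step 1
    val jsc = new JavaSparkContext(conf)             // step 3
    // Step 4: set breakpoints and launch this class under Eclipse's debugger.
    jsc.stop()
  }
}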
Cycling bits:
http://search-hadoop.com/m/JW1q5wxkXH/spark+eclipse&subj=Buidling+spark+in+Eclipse+Kepler
On Wed, Oct 1, 2014 at 4:35 PM, Sanjay Subramanian
<sanjaysubraman...@yahoo.com.invalid> wrote:
> hey guys
>
> Is there a way to run Spark in local mode from within Eclipse?
> I am running Eclipse