From: Madhu
To: u...@spark.incubator.apache.org,
Date: 06/07/2014 05:21 PM
Subject: Re: best practice: write and debug Spark application in scala-ide and maven
For debugging, I run locally inside Eclipse without maven.
I just add the Spark assembly jar to my Eclipse project build path and click
'Run As... Scala Application'.
I have done the same with Java and ScalaTest; it's quick and easy.
I didn't see any third party jar dependencies in your code, so t
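For reference, a minimal sketch of the kind of self-contained object that can be launched this way, assuming the Spark assembly jar is already on the Eclipse build path (the object name, app name, and sample data are hypothetical, not from the original thread):

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical standalone app: right-click the object in Eclipse and
// choose 'Run As... Scala Application' to debug it in-process.
object LocalDebugApp {
  def main(args: Array[String]): Unit = {
    // 'local[4]' runs Spark inside this JVM with 4 worker threads.
    val conf = new SparkConf().setMaster("local[4]").setAppName("LocalDebugApp")
    val sc = new SparkContext(conf)

    val words = sc.parallelize(Seq("debug", "spark", "in", "eclipse", "spark"))
    val counts = words.map(w => (w, 1)).reduceByKey(_ + _)
    counts.collect().foreach(println)

    sc.stop()
  }
}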
Sounds like there are two questions here:
First, from the command line, if you run "mvn package" and then run the
code with "java -cp target/*jar-with-dependencies.jar com.ibm.App", do
you still get the error?
Second, for quick debugging, I agree that it's a pain to wait for mvn
package to finish every time.
I think that you have two options:
- to run your code locally, you can use local mode by setting the 'local'
master like so: new SparkConf().setMaster("local[4]"), where 4 is the number
of cores used by local mode (see the sketch after this list).
- to run your code remotely, you need to build the jar with dependencies and
submit it to the cluster.
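A sketch of how the two options might look in code; the master URL, jar path, and names below are assumptions for illustration, not taken from the original thread:

import org.apache.spark.{SparkConf, SparkContext}

object App {
  def main(args: Array[String]): Unit = {
    val conf =
      if (args.contains("--local"))
        // Option 1: local mode, everything runs in this JVM with 4 threads.
        new SparkConf().setMaster("local[4]").setAppName("App")
      else
        // Option 2: point at a remote master and ship the jar-with-dependencies
        // so executors can load your classes (URL and path are hypothetical).
        new SparkConf()
          .setMaster("spark://master-host:7077")
          .setAppName("App")
          .setJars(Seq("target/app-1.0-jar-with-dependencies.jar"))

    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 100).sum())
    sc.stop()
  }
}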