From: Madhu
To: u...@spark.incubator.apache.org
Date: 06/07/2014 05:21 PM
Subject: Re: best practice: write and debug Spark application in scala-ide and maven
For debugging, I run locally inside Eclipse without maven.
I just add the Spark assembly jar to my Eclipse build path. I don't see any
other dependencies in your code, so that should be sufficient for your example.
-
Madhu
https://www.linkedin.com/in/msiddalingaiah
Sounds like there are two questions here:
First, from the command line, if you "mvn package" and then run the code
with "java -cp target/*jar-with-dependencies.jar com.ibm.App", do you
still get the error?
Second, for quick debugging, I agree that it's a pain to wait for mvn
package to finish every time.
I think that you have two options (see the sketch after this list):
- To run your code locally, you can use local mode by setting the 'local'
master like so: new SparkConf().setMaster("local[4]"), where 4 is the number
of cores assigned to local mode.
- To run your code remotely, you need to build the jar with dependencies and
add it to the SparkConf (for example via setJars) so the workers can load
your classes.
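A minimal sketch combining the two options (the master URL, jar path, and
the --local flag are placeholders, not from the original thread):

import org.apache.spark.{SparkConf, SparkContext}

object App {
  def main(args: Array[String]) {
    val conf =
      if (args.contains("--local"))
        // Option 1: run inside the local JVM with 4 cores
        new SparkConf().setMaster("local[4]").setAppName("App")
      else
        // Option 2: target the remote master and ship the fat jar to the workers
        new SparkConf()
          .setMaster("spark://xxx:7077")
          .setAppName("App")
          .setJars(Seq("target/myapp-1.0-jar-with-dependencies.jar"))
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 100).reduce(_ + _))
    sc.stop()
  }
}

With setJars, the driver distributes the assembled jar to the executors,
which avoids ClassNotFoundException for your classes on the workers.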
Hi,
I am trying to write and debug Spark applications in scala-ide and maven,
and in my code I target a Spark instance at spark://xxx:
import org.apache.spark.{SparkConf, SparkContext}

object App {
  def main(args: Array[String]) {
    println("Hello World!")
    val sparkConf = new SparkConf().setMaster("spark://xxx:7077").setAppName("App") // name is a placeholder; the message is cut off here
    val sc = new SparkContext(sparkConf)
  }
}