Hi, 

I have noticed that the GroupByTest example in
https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/GroupByTest.scala
has been changed to be run using spark-submit.
Previously, I set "local" as the first command-line parameter, and this enabled
me to run GroupByTest in Eclipse:
val sc = new SparkContext(args(0), "GroupBy Test",
  System.getenv("SPARK_HOME"), SparkContext.jarOfClass(this.getClass).toSeq)


In the latest GroupByTest code, I cannot pass in "local" as the first command-line
parameter:
val sparkConf = new SparkConf().setAppName("GroupBy Test")
var numMappers = if (args.length > 0) args(0).toInt else 2
var numKVPairs = if (args.length > 1) args(1).toInt else 1000
var valSize = if (args.length > 2) args(2).toInt else 1000
var numReducers = if (args.length > 3) args(3).toInt else numMappers
val sc = new SparkContext(sparkConf)


Is there a way to specify "master=local" (maybe in an environment variable),
so that I can run the latest version of GroupByTest in Eclipse without changing
the code?
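
The only workaround I can think of is a small wrapper object that sets the
spark.master system property before calling the example's main method. This is
just a sketch, and it assumes that SparkConf picks up JVM system properties
starting with "spark." (I have not verified this); the object name
RunGroupByTestLocally is purely illustrative:

import org.apache.spark.examples.GroupByTest

// Illustrative wrapper only: assumes SparkConf reads "spark.*" system
// properties, so spark.master set here plays the role of spark-submit's
// --master option. GroupByTest itself is left unchanged.
object RunGroupByTestLocally {
  def main(args: Array[String]): Unit = {
    System.setProperty("spark.master", "local[2]")
    // GroupByTest's own arguments: numMappers, numKVPairs, valSize, numReducers
    GroupByTest.main(Array("2", "1000", "1000", "2"))
  }
}

If that assumption is right, I suppose passing -Dspark.master=local[2] as a VM
argument in the Eclipse run configuration would have the same effect, but I would
prefer a supported way of doing this.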

Thanks in advance for your assistance!

Shing 
