Hm, why do you expect a factory method over a constructor? No, you
instantiate a SparkContext yourself (if not working in the shell).
When you write your own program, you parse your own command-line args.
--master yarn-client doesn't do anything unless you make it do so;
that is an arg to *Spark* programs.
Thanks guys, this is very useful :)
@Stephen, I know spark-shell will create a SC for me. But I don't
understand why we still need to do "new SparkContext(...)" in our own
code. Shouldn't we get it from somewhere, e.g. "SparkContext.get"?
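For what it's worth, newer Spark releases added essentially the accessor you're describing: SparkContext.getOrCreate (introduced in Spark 1.4). A minimal sketch, assuming Spark 1.4+ is on the classpath; the master and app name here are just placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object GetOrCreateExample {
  def main(args: Array[String]): Unit = {
    // getOrCreate returns the already-active SparkContext if one exists,
    // otherwise it constructs a new one from the given SparkConf.
    val conf = new SparkConf().setMaster("local[2]").setAppName("mptest")
    val sc = SparkContext.getOrCreate(conf)

    // Calling it again yields the same context rather than a second one.
    val same = SparkContext.getOrCreate(conf)
    assert(sc eq same)

    sc.stop()
  }
}
```

On versions before 1.4 (current at the time of this thread), `new SparkContext(...)` was the only option outside the shell.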
Another question: if I want my Spark code to run in YARN later
Use the spark-shell command and the shell will open. Type :paste and
then paste your code; after Ctrl-D it will run.

To open spark-shell:

cd spark/bin
./spark-shell
Sent from my iPhone
> On 6 Mar 2015, at 02:28, "fightf...@163.com" wrote:
Hi,
You can first set up a Scala IDE to develop and debug your Spark program,
say, IntelliJ IDEA or Eclipse.
Thanks,
Sun.
fightf...@163.com
From: Xi Shen
Date: 2015-03-06 09:19
To: user@spark.apache.org
Subject: Spark code development practice
Hi,
I am new to Spark. I see every spa
Hi Xi,
Yes, you can do the following:
// Run locally with 2 worker threads
val sc = new SparkContext("local[2]", "mptest")
// or, against a standalone cluster (default master port is 7077):
// val sc = new SparkContext("spark://master:7077", "mptest")

// Read all files under a directory into an RDD of lines
val fileDataRdd = sc.textFile("/path/to/dir")
// Pull the first 100 lines back to the driver
val fileLines = fileDataRdd.take(100)
The key here - i.e. the answer to your specific question
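On the YARN question upthread: a minimal sketch of a standalone app that leaves the master unset in code, so that the --master flag passed to spark-submit (e.g. yarn-client) takes effect. The object name, app name, and jar name below are made up for illustration:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object WordLines {
  def main(args: Array[String]): Unit = {
    // No setMaster here: the master is supplied at submit time, e.g.
    //   spark-submit --master yarn-client --class WordLines wordlines.jar /path/to/dir
    val conf = new SparkConf().setAppName("WordLines")
    val sc = new SparkContext(conf)

    // Same pattern as the snippet above: read a directory, take 100 lines
    val lines = sc.textFile(args(0)).take(100)
    lines.foreach(println)

    sc.stop()
  }
}
```

The same jar then runs locally with --master local[2], or on YARN with --master yarn-client, with no code change.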