We need to pass the URL only when we are using the interactive shell, right?
I am not using the interactive shell; I am just running
./bin/run-example.......... from the Spark directory.
>>If not, Spark may be ignoring your single-node cluster and defaulting to
local mode.
What does this mean? Can my code work even without Spark coming up? Does the
same thing happen if I have a multi-node cluster?
Thank You


On Sun, Aug 3, 2014 at 2:24 PM, Ankur Dave <ankurd...@gmail.com> wrote:

> At 2014-08-03 13:14:52 +0530, Deep Pradhan <pradhandeep1...@gmail.com>
> wrote:
> > I have a single-node cluster on which I have Spark running. I ran some
> > GraphX code on a data set. Now when I stop all the workers in the
> > cluster (sbin/stop-all.sh), the code still runs and gives the answers.
> Why
> > is that? I mean, does GraphX run even without Spark coming up?
>
> Are you passing the correct master URL (spark://master-address:7077) to
> run-example or spark-submit? If not, Spark may be ignoring your single-node
> cluster and defaulting to local mode.
>
> Ankur
>
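To make Ankur's point concrete: if no master URL is supplied, Spark falls back to local mode, which runs entirely inside the driver JVM and does not need any standalone workers. A sketch of the two invocations (the host name `master-host` and the SparkPi example are placeholders; adjust to your setup):

```shell
# Runs against the standalone cluster: requires the master and
# workers started via sbin/start-all.sh to be up.
./bin/spark-submit --master spark://master-host:7077 \
    --class org.apache.spark.examples.SparkPi \
    lib/spark-examples-*.jar

# Runs in local mode: no cluster daemons involved at all, so
# sbin/stop-all.sh has no effect on it.
./bin/spark-submit --master local[*] \
    --class org.apache.spark.examples.SparkPi \
    lib/spark-examples-*.jar
```

For run-example, the master can be set through the MASTER environment variable, e.g. `MASTER=spark://master-host:7077 ./bin/run-example ...`; without it, the examples default to local mode, which is why they keep working after the workers are stopped.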
