At 2014-08-03 13:14:52 +0530, Deep Pradhan <pradhandeep1...@gmail.com> wrote:
> I have a single-node cluster on which I have Spark running. I ran some
> GraphX code on a data set. Now when I stop all the workers in the
> cluster (sbin/stop-all.sh), the code still runs and gives the answers. Why
> is that? Does GraphX run even without Spark being up?

Are you passing the correct master URL (spark://master-address:7077) to 
run-example or spark-submit? If not, Spark may be ignoring your single-node 
cluster and defaulting to local mode. In local mode the driver executes all 
tasks inside its own JVM, so it never contacts the standalone workers — which 
is why stopping them has no effect on your jobs.
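A quick way to check is to pass --master explicitly. The sketch below assumes a standalone master on the default port 7077; master-address, the class name, and the file paths are placeholders you would replace with your own:

```shell
# Submit against the standalone cluster explicitly. If --master is omitted
# (and spark.master is not set elsewhere), the driver falls back to local
# mode and runs everything in one JVM, ignoring the workers entirely.
./bin/spark-submit \
  --master spark://master-address:7077 \
  --class com.example.MyGraphXApp \
  /path/to/my-graphx-app.jar

# For comparison, this forces local mode with all available cores:
./bin/spark-submit \
  --master "local[*]" \
  --class com.example.MyGraphXApp \
  /path/to/my-graphx-app.jar
```

You can also confirm which master a running job used by checking the "Spark Master" field in the application UI at http://driver-host:4040/environment/.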

Ankur

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
