Is it possible to run across a cluster using the Spark interactive shell?

To be more explicit, is the procedure similar to running a standalone
master-worker Spark deployment?

I want to execute my code in the interactive shell on the master node, and
it should run across the cluster (say, 5 nodes). Is the procedure similar?
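For context, this is roughly how I imagine launching it (the master URL
`spark://master-host:7077` is just a placeholder for my master node; the
port 7077 is the default for a standalone master):

```
# Start the shell on the master node, pointing it at the standalone
# cluster master instead of the default local mode.
# Any RDD actions typed into the shell should then be distributed
# across the registered worker nodes.
./bin/spark-shell --master spark://master-host:7077
```

My understanding is that without the `--master` flag the shell runs in
local mode on a single machine, which is what I want to avoid.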





-- 
*Sai Prasanna. AN*
*II M.Tech (CS), SSSIHL*


*Entire water in the ocean can never sink a ship, unless it gets inside.
All the pressures of life can never hurt you, unless you let them in.*
