Unfortunately we don't have anything to do with Spark on GCE, so I'd suggest
asking in the GCE support forum. You could also try to launch a Spark cluster
by hand on nodes there. Sigmoid Analytics published a package for this here:
http://spark-packages.org/package/9
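Launching a standalone cluster by hand mostly comes down to running the master and worker scripts that ship with Spark. A rough sketch below; the install path and hostname are assumptions, and the worker script was named `start-slave.sh` in the 1.x releases:

```shell
# Sketch only: assumes a Spark 1.x distribution unpacked at $SPARK_HOME
# on every node, with the nodes able to reach each other over the network.
SPARK_HOME=${SPARK_HOME:-/opt/spark}        # assumed install path
MASTER_HOST=${MASTER_HOST:-$(hostname)}     # assumed master hostname
MASTER_URL="spark://$MASTER_HOST:7077"      # 7077 is the default master port

if [ -x "$SPARK_HOME/sbin/start-master.sh" ]; then
  # On the master node: start the standalone master
  # (its web UI listens on port 8080 by default).
  "$SPARK_HOME/sbin/start-master.sh"
  # On each worker node: register a worker with the master.
  "$SPARK_HOME/sbin/start-slave.sh" "$MASTER_URL"
else
  echo "Spark not found at $SPARK_HOME; commands shown for illustration only"
fi
```

Once the workers register, pointing a shell or job at `$MASTER_URL` (e.g. `spark-shell --master "$MASTER_URL"`) runs it on the cluster.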
Matei
> On Jan 17, 2015
I'm deploying Spark using the "Click to Deploy" Hadoop -> "Install Apache
Spark" on Google Compute Engine.
I can run Spark jobs on the REPL and read data from Google storage.
However, I'm not sure how to access the Spark UI in this deployment. Can
anyone help?
Also, it deploys Spark 1.1. Is there