Getting to the Spark web UI when Spark runs on Dataproc is not
entirely straightforward. Connecting to the web interface is a
two-step process:

1. create an SSH tunnel
2. configure the browser to use a SOCKS proxy to connect

The above steps are described here:
https://cloud.google.com/dataproc/docs/concepts/cluster-web-interfaces
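The two steps above look roughly like this on the command line (a
sketch only -- the cluster name "mycluster", the zone, and the local
proxy port 1080 are example values you would substitute with your
own):

```shell
# Step 1: open an SSH tunnel to the master node, exposing a SOCKS
# proxy on local port 1080. "-N" means no remote command, "-D 1080"
# asks ssh to do dynamic port forwarding.
gcloud compute ssh mycluster-m \
    --zone=us-central1-a \
    -- -D 1080 -N

# Step 2: in a second terminal, start a browser that routes all of
# its traffic through that proxy (Chrome shown here; a fresh
# --user-data-dir keeps the proxy setting out of your normal profile).
google-chrome \
    --proxy-server="socks5://localhost:1080" \
    --user-data-dir=/tmp/mycluster-m \
    http://mycluster-m:4040
```

Leave the first command running for as long as you need the tunnel.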

Once your browser is configured and running, go to
http://<master-node-name>:4040 for the Spark web UI and
http://<master-node-name>:18080 for the Spark history server.

<master-node-name> is the cluster name with an "-m" suffix. So, if
the cluster name is "mycluster", the master node will be called
"mycluster-m".
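For example, deriving both URLs from a cluster name could be sketched
like this ("mycluster" is again just a placeholder):

```shell
# The Dataproc master node is the cluster name plus an "-m" suffix.
CLUSTER_NAME=mycluster
MASTER="${CLUSTER_NAME}-m"

# Spark web UI (driver) and history server, as given above.
echo "http://${MASTER}:4040"
echo "http://${MASTER}:18080"
```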

Cheers,
Dinko

On 7 February 2017 at 21:41, Jacek Laskowski <ja...@japila.pl> wrote:
> Hi,
>
> I know nothing about Spark in GCP so answering this for a pure Spark.
>
> Can you use web UI and Executors tab or a SparkListener?
>
> Jacek
>
> On 7 Feb 2017 5:33 p.m., "Anahita Talebi" <anahita.t.am...@gmail.com> wrote:
>
> Hello Friends,
>
> I am trying to run Spark code on multiple machines. To this end, I
> submit a Spark job on Google Cloud Platform:
> https://cloud.google.com/dataproc/docs/guides/submit-job
>
> I have created a cluster with 6 nodes. Does anyone know how I can tell
> which nodes participated when I ran the code on the cluster?
>
> Thanks a lot,
> Anahita
>
>

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
