(This also works for me on a fresh VM, fresh pull, etc.)

On Tue, Jan 13, 2015 at 10:03 AM, Robin East <robin.e...@xense.co.uk> wrote:
> I’ve just pulled down the latest commits from github, and done the
> following:
>
> 1)
> mvn clean package -DskipTests
>
> builds fine
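>
> (If you're building against a specific Hadoop version it may also be worth checking the build profile; the profile and version below are only an example, not what I used:
>
> mvn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
>
> but the plain build above worked fine for me.)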
>
> 2)
> ./bin/spark-shell works
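>
> (A quick sanity check, assuming nothing beyond the stock API, is to pipe a one-liner into the shell:
>
> echo 'sc.parallelize(1 to 1000).count()' | ./bin/spark-shell
>
> which should show res0: Long = 1000 somewhere in the output.)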
>
> 3)
> run SparkPi example with no problems:
>
> ./bin/run-example SparkPi 10
>
> 4)
> Started a master
>
> ./sbin/start-master.sh
>
> grabbed the MasterWebUI address from the master log ("Started MasterWebUI at
> http://x.x.x.x:8080")
>
> I can view the MasterWebUI from a local browser
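>
> (If the browser can't reach it, a quick check from the master box itself, using the host/port reported in your master log:
>
> curl -s http://x.x.x.x:8080 | head
>
> should print the start of the UI's HTML.)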
>
> 5)
> grabbed the spark:// URL from the master log and started a local worker:
>
> ./bin/spark-class org.apache.spark.deploy.worker.Worker
> spark://<hostname>:7077 &
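>
> (You can also pass resources explicitly to the Worker; the cores/memory values here are just placeholders:
>
> ./bin/spark-class org.apache.spark.deploy.worker.Worker --cores 2 --memory 2g spark://<hostname>:7077 &
>
> but the bare form above is enough for a smoke test.)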
>
> 6)
> Ran jps to confirm both Master and Worker processes are present.
>
> 7)
> Ran SparkPi on the mini-cluster:
>
> MASTER=spark://<host>:7077 ./bin/run-example SparkPi 10
>
> All worked fine, and I can see the application in the MasterWebUI
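>
> (Equivalently you can drive it through spark-submit; the examples jar path depends on how you built, so treat the one below as a guess:
>
> ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://<host>:7077 examples/target/scala-2.10/spark-examples-*.jar 10
>
> run-example is just a convenience wrapper around the same thing.)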
>
> Which of these steps doesn’t work for you? I presume you’ve tried re-pulling
> from git and doing a clean build again.
>
> Robin
> On 13 Jan 2015, at 08:07, Ganon Pierce <ganon.pie...@me.com> wrote:
>
> After a clean build I’m still receiving the same error.
