Ah, never mind. The 0.0.0.0 is for the UI, not for the Master, which binds
using the output of the `hostname` command. But yes, long story short: go to
the web UI and use the URL shown there.
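For reference, a minimal sketch of the workaround (assuming the default
master port 7077; your web UI may show a different one):

```shell
#!/bin/sh
# The standalone master registers itself under the machine's hostname,
# not "localhost", so build the master URL from the `hostname` output.
MASTER_HOST="$(hostname)"
MASTER_URL="spark://${MASTER_HOST}:7077"   # 7077 is the default master port

echo "Start the shell against the master with:"
echo "  MASTER=${MASTER_URL} bin/spark-shell"
```

This should print the same spark:// URL that appears in the top-left corner
of the master web UI at http://<hostname>:8080.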


2014-06-23 11:13 GMT-07:00 Andrew Or <and...@databricks.com>:

> Hm, spark://localhost:7077 should work, because the standalone master
> binds to 0.0.0.0. Are you sure you ran `sbin/start-master.sh`?
>
>
> 2014-06-22 22:50 GMT-07:00 Akhil Das <ak...@sigmoidanalytics.com>:
>
>> Open the web UI in your browser, find the Spark URL in the top-left
>> corner of the page, and use it when starting your spark-shell instead of
>> localhost:7077.
>>
>> Thanks
>> Best Regards
>>
>>
>> On Mon, Jun 23, 2014 at 10:56 AM, rapelly kartheek <
>> kartheek.m...@gmail.com> wrote:
>>
>>> Hi,
>>>   Can someone help me with the following error I faced while setting
>>> up a single-node Spark cluster?
>>>
>>> karthik@karthik-OptiPlex-9020:~/spark-1.0.0$
>>> MASTER=spark://localhost:7077 sbin/spark-shell
>>> bash: sbin/spark-shell: No such file or directory
>>> karthik@karthik-OptiPlex-9020:~/spark-1.0.0$
>>> MASTER=spark://localhost:7077 bin/spark-shell
>>> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
>>> MaxPermSize=128m; support was removed in 8.0
>>> 14/06/23 10:44:53 INFO spark.SecurityManager: Changing view acls to:
>>> karthik
>>> 14/06/23 10:44:53 INFO spark.SecurityManager: SecurityManager:
>>> authentication disabled; ui acls disabled; users with view permissions:
>>> Set(karthik)
>>> 14/06/23 10:44:53 INFO spark.HttpServer: Starting HTTP Server
>>> 14/06/23 10:44:53 INFO server.Server: jetty-8.y.z-SNAPSHOT
>>> 14/06/23 10:44:53 INFO server.AbstractConnector: Started
>>> SocketConnector@0.0.0.0:39588
>>> Welcome to
>>>       ____              __
>>>      / __/__  ___ _____/ /__
>>>     _\ \/ _ \/ _ `/ __/  '_/
>>>    /___/ .__/\_,_/_/ /_/\_\   version 1.0.0
>>>       /_/
>>>
>>> Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java
>>> 1.8.0_05)
>>> Type in expressions to have them evaluated.
>>> Type :help for more information.
>>> 14/06/23 10:44:55 INFO spark.SecurityManager: Changing view acls to:
>>> karthik
>>> 14/06/23 10:44:55 INFO spark.SecurityManager: SecurityManager:
>>> authentication disabled; ui acls disabled; users with view permissions:
>>> Set(karthik)
>>> 14/06/23 10:44:55 INFO slf4j.Slf4jLogger: Slf4jLogger started
>>> 14/06/23 10:44:55 INFO Remoting: Starting remoting
>>> 14/06/23 10:44:55 INFO Remoting: Remoting started; listening on
>>> addresses :[akka.tcp://spark@karthik-OptiPlex-9020:50294]
>>> 14/06/23 10:44:55 INFO Remoting: Remoting now listens on addresses:
>>> [akka.tcp://spark@karthik-OptiPlex-9020:50294]
>>> 14/06/23 10:44:55 INFO spark.SparkEnv: Registering MapOutputTracker
>>> 14/06/23 10:44:55 INFO spark.SparkEnv: Registering BlockManagerMaster
>>> 14/06/23 10:44:55 INFO storage.DiskBlockManager: Created local directory
>>> at /tmp/spark-local-20140623104455-3297
>>> 14/06/23 10:44:55 INFO storage.MemoryStore: MemoryStore started with
>>> capacity 294.6 MB.
>>> 14/06/23 10:44:55 INFO network.ConnectionManager: Bound socket to port
>>> 60264 with id = ConnectionManagerId(karthik-OptiPlex-9020,60264)
>>> 14/06/23 10:44:55 INFO storage.BlockManagerMaster: Trying to register
>>> BlockManager
>>> 14/06/23 10:44:55 INFO storage.BlockManagerInfo: Registering block
>>> manager karthik-OptiPlex-9020:60264 with 294.6 MB RAM
>>> 14/06/23 10:44:55 INFO storage.BlockManagerMaster: Registered
>>> BlockManager
>>> 14/06/23 10:44:55 INFO spark.HttpServer: Starting HTTP Server
>>> 14/06/23 10:44:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
>>> 14/06/23 10:44:55 INFO server.AbstractConnector: Started
>>> SocketConnector@0.0.0.0:38307
>>> 14/06/23 10:44:55 INFO broadcast.HttpBroadcast: Broadcast server started
>>> at http://10.0.1.61:38307
>>> 14/06/23 10:44:55 INFO spark.HttpFileServer: HTTP File server directory
>>> is /tmp/spark-082a44f6-e877-48cc-8ab7-1bcbcf8136b0
>>> 14/06/23 10:44:55 INFO spark.HttpServer: Starting HTTP Server
>>> 14/06/23 10:44:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
>>> 14/06/23 10:44:55 INFO server.AbstractConnector: Started
>>> SocketConnector@0.0.0.0:58745
>>> 14/06/23 10:44:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
>>> 14/06/23 10:44:56 INFO server.AbstractConnector: Started
>>> SelectChannelConnector@0.0.0.0:4040
>>> 14/06/23 10:44:56 INFO ui.SparkUI: Started SparkUI at
>>> http://karthik-OptiPlex-9020:4040
>>> 14/06/23 10:44:56 WARN util.NativeCodeLoader: Unable to load
>>> native-hadoop library for your platform... using builtin-java classes where
>>> applicable
>>> 14/06/23 10:44:56 INFO client.AppClient$ClientActor: Connecting to
>>> master spark://localhost:7077...
>>> 14/06/23 10:44:56 INFO repl.SparkILoop: Created spark context..
>>> 14/06/23 10:44:56 WARN client.AppClient$ClientActor: Could not connect
>>> to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>> Spark context available as sc.
>>>
>>> scala> 14/06/23 10:44:56 WARN client.AppClient$ClientActor: Could not
>>> connect to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>> 14/06/23 10:44:56 WARN client.AppClient$ClientActor: Could not connect
>>> to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>> 14/06/23 10:44:56 WARN client.AppClient$ClientActor: Could not connect
>>> to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>> 14/06/23 10:45:16 INFO client.AppClient$ClientActor: Connecting to
>>> master spark://localhost:7077...
>>> 14/06/23 10:45:16 WARN client.AppClient$ClientActor: Could not connect
>>> to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>> 14/06/23 10:45:16 WARN client.AppClient$ClientActor: Could not connect
>>> to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>> 14/06/23 10:45:16 WARN client.AppClient$ClientActor: Could not connect
>>> to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>> 14/06/23 10:45:16 WARN client.AppClient$ClientActor: Could not connect
>>> to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>> 14/06/23 10:45:36 INFO client.AppClient$ClientActor: Connecting to
>>> master spark://localhost:7077...
>>> 14/06/23 10:45:36 WARN client.AppClient$ClientActor: Could not connect
>>> to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>> 14/06/23 10:45:36 WARN client.AppClient$ClientActor: Could not connect
>>> to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>> 14/06/23 10:45:36 WARN client.AppClient$ClientActor: Could not connect
>>> to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>> 14/06/23 10:45:36 WARN client.AppClient$ClientActor: Could not connect
>>> to akka.tcp://sparkMaster@localhost:7077:
>>> akka.remote.EndpointAssociationException: Association failed with
>>> [akka.tcp://sparkMaster@localhost:7077]
>>>
>>>
>>>
>>>  Thanks in advance!!!
>>>
>>
>>
>