Hi,

I am trying to deploy a Spark application on a Kubernetes cluster. The cluster 
consists of 2 machines - 1 master and 1 slave - each with the following 
configuration:
RHEL 7.2
Docker 17.03.1
Kubernetes 1.7

I am following the steps provided in 
https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html 
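For context, my submission follows the general command shape from that guide. The API server address, image tags, and jar path below are placeholders from the docs, not my exact values:

```shell
bin/spark-submit \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --master k8s://https://<api-server-host>:<port> \
  --kubernetes-namespace default \
  --conf spark.executor.instances=2 \
  --conf spark.app.name=spark-pi \
  --conf spark.kubernetes.driver.docker.image=kubespark/spark-driver:<tag> \
  --conf spark.kubernetes.executor.docker.image=kubespark/spark-executor:<tag> \
  local:///opt/spark/examples/jars/<spark-examples-jar>
```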

When I submit an application (SparkPi), a driver pod is created on the slave 
machine, but it exits with the following exception:

2017-10-09 22:13:24 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
2017-10-09 22:13:30 ERROR SparkContext:91 - Error initializing SparkContext.
java.nio.channels.UnresolvedAddressException
        at sun.nio.ch.Net.checkAddress(Net.java:101)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:218)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:748)
2017-10-09 22:13:30 INFO  SparkContext:54 - Successfully stopped SparkContext

Has anyone come across this problem, or does anyone know why this might be happening?

Thanks,
Suman.
