Not exactly what you want, but I have an example here:
https://github.com/lresende/docker-systemml-notebook

You should be able to accomplish what you want by playing with --link, which
is what I did in the example below (though just with YARN and HDFS):
https://github.com/lresende/docker-yarn-cluster
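
If it helps, here is a rough sketch of the --link approach; the image and
container names are placeholders rather than the ones from the repos above:

  # start the Spark/YARN side first (placeholder image name)
  docker run -d --name spark-master my-spark-image

  # link the Zeppelin container to it; "spark-master" then resolves inside
  # the Zeppelin container through /etc/hosts
  docker run -d --name zeppelin --link spark-master:spark-master \
      -p 8080:8080 my-zeppelin-image

  # in Zeppelin's Spark interpreter, point master at spark://spark-master:7077

With --link both containers sit on the same Docker bridge network, so the
driver's container IP is routable from the Spark side; that is the part that
breaks once the cluster lives outside Docker.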

On Fri, Aug 5, 2016 at 11:07 PM, DuyHai Doan <doanduy...@gmail.com> wrote:

> No, using Docker Compose is easy; what I want is:
>
> 1) Zeppelin running inside a Docker container
> 2) Spark deployed in standalone mode, running somewhere on bare metal /
> cloud / Docker, but on another network
>
> In this scenario, it's very hard to get the Zeppelin client living inside
> the Docker container to communicate with the external Spark cluster.
>
> On Fri, Aug 5, 2016 at 9:00 PM, George Webster <webste...@gmail.com>
> wrote:
>
>> Can you share your Dockerfile? Also, are you using docker-compose or just
>> docker-machine?
>>
>> On Fri, Aug 5, 2016 at 8:46 PM, DuyHai Doan <doanduy...@gmail.com> wrote:
>>
>>> Hello guys
>>>
>>> Has anyone attempted to run Zeppelin inside Docker connecting to a real
>>> Spark cluster running on the host machine?
>>>
>>> I've spent a day trying to make it work, but without success: the job
>>> never completes because the driver program (the Zeppelin Spark shell) is
>>> listening on an internal IP address (the internal IP address of the
>>> container).
>>>
>>> I've tried running the Docker container with host networking (--net=host),
>>> but in that case I cannot access Zeppelin through localhost:8080 or
>>> 127.0.0.1:8080.
>>>
>>> Does anyone have an idea to unblock my use case?
>>>
>>
>>
>
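
For the scenario above (Zeppelin in a container talking to a Spark standalone
cluster outside Docker), a rough sketch of the settings that usually need
pinning; the property names are standard Spark driver settings, and the
addresses and ports below are placeholders, not tested values:

  # zeppelin-env.sh inside the container: advertise the Docker host's address
  # and fix the driver ports so they can be published
  export SPARK_SUBMIT_OPTIONS="--conf spark.driver.host=<docker-host-ip> \
      --conf spark.driver.port=40000 \
      --conf spark.blockManager.port=40001"

  # publish the Zeppelin UI and the driver ports 1:1 so the external
  # executors can call back into the driver
  docker run -d -p 8080:8080 -p 40000:40000 -p 40001:40001 my-zeppelin-image

The -p mappings have to be identical on both sides, since the executors will
connect to <docker-host-ip>:40000 directly.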


-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/
