What you are describing is possible, but there is a set of
constraints and trade-offs that may not give the experience you want.
Colab faces these same constraints and trade-offs.

* JupyterLab and the classic Notebook rely on a range of different
service APIs - kernels, contents (files/directories), terminals,
kernel specs (kernel metadata), etc. The standard way of operating
Jupyter is to deploy all of these services in the same server, and to
serve the frontend JS/HTML/CSS from that same server as well. In the
abstract, the architecture certainly allows you to deploy different
services on different servers. In practice, however, that is probably
not what you want to do, because the services are coupled in ways
that directly affect usability:

* Kernels and the file system are coupled: a kernel needs to see the
same file system as the contents service - to import modules sitting
next to the notebook, and to access co-located data files. So you can
run the kernel service separately, but it won't be useful for real
work unless you also figure out how to mount the same file system for
that kernel.
* Terminals and the file system are coupled, as users typically
expect to see the real file system in the terminal.
* The frontend and kernels are also coupled, because many kernel-side
libraries require matching frontend extensions to be installed.
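To make the first point concrete: the services are plain REST APIs, so talking to a kernel service on a different server is the easy part - the hard part is the shared file system. A minimal sketch (the host name and token below are hypothetical) that lists running kernels on a separate kernel server via the standard /api/kernels endpoint:

```python
import json
import urllib.request

API_KERNELS = "/api/kernels"

def kernels_url(base_url):
    """Build the URL of the standard kernels REST endpoint."""
    return base_url.rstrip("/") + API_KERNELS

def list_kernels(base_url, token):
    """List running kernels on a (possibly remote) Jupyter server."""
    req = urllib.request.Request(
        kernels_url(base_url),
        headers={"Authorization": "token " + token},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode())

# Hypothetical remote kernel service; nothing in the REST API itself
# ties kernels to the server that serves the frontend assets:
# list_kernels("http://kernels.example.com:8888", "secret-token")
```

Nothing stops this from working across hosts - but the kernels it starts will still only see the file system of the machine they run on.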

These are exactly the trade-offs made by Colab, and they severely
limit its usefulness for real work. But I do think these trade-offs
make sense for some use cases. Right now, the code base isn't
optimized for this more decoupled usage, but we have several efforts
moving in this direction.

There is work to separate out the Jupyter server and decouple its
components here:

https://github.com/jupyter/jupyter_server

Unless I am misunderstanding your question, I don't think you are
looking to make changes to the frontend - so I think stock JupyterLab
or the classic notebook would work fine. What you want is a different
way of factoring the server side of things. Probably the most
flexible way of refactoring the server would be to run the components
of jupyter_server in separate containers, managed by something like
Docker Compose with a Traefik reverse proxy in front. I would love to
see explorations along these lines.
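As a rough sketch of what such a deployment could look like - purely illustrative, since the image names and the exact service split are hypothetical, not an existing, supported configuration - note the shared volume, which is what addresses the kernel/file-system coupling above:

```yaml
version: "3"
services:
  proxy:
    # Traefik 1.x routes requests to the right backend by path prefix.
    image: traefik:1.7
    command: --docker
    ports:
      - "80:80"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
  frontend:
    # Lightweight container: frontend assets plus the contents API.
    image: my-org/jupyter-frontend    # hypothetical image
    labels:
      - "traefik.frontend.rule=PathPrefix:/"
    volumes:
      - notebooks:/home/jovyan/work
  kernels:
    # Heavyweight container: the kernels API, with more CPU/RAM.
    image: my-org/jupyter-kernels     # hypothetical image
    labels:
      - "traefik.frontend.rule=PathPrefix:/api/kernels"
    volumes:
      - notebooks:/home/jovyan/work   # same files the kernels must see
volumes:
  notebooks:
```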

I should also note that the Jupyter kernel gateway might help with this work:

https://github.com/jupyter/kernel_gateway
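As a sketch of how the gateway splits things (the host name is hypothetical, and you should check the project docs for the exact options - newer notebook releases support --gateway-url directly, while older ones used the NB2KG extension):

```shell
# On the resource-heavy host: serve only the kernels API.
jupyter kernelgateway --KernelGatewayApp.ip=0.0.0.0 \
    --KernelGatewayApp.port=9999

# On the lightweight host: point a stock notebook server at it.
jupyter notebook --gateway-url=http://kernel-host.example.com:9999
```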

Hope this helps!

On Sat, Apr 27, 2019 at 1:50 AM Song Liu <[email protected]> wrote:
>
> Hi,
>
> Just like the Google Colab user experience, is it possible to separate the 
> JupyterLab web frontend and backend?
>
> That is, the notebook viewing and editing functionality would be provided by 
> the JupyterLab web service with minimal resource consumption.
> When running Python code, it would be sent to a remote JupyterLab backend 
> (server and kernel).
>
> Thanks,
> Song
>
> --
> You received this message because you are subscribed to the Google Groups 
> "Project Jupyter" group.
> To unsubscribe from this group and stop receiving emails from it, send an 
> email to [email protected].
> To post to this group, send email to [email protected].
> To view this discussion on the web visit 
> https://groups.google.com/d/msgid/jupyter/d9872648-8909-4ad8-84f7-853ae8c81460%40googlegroups.com.
> For more options, visit https://groups.google.com/d/optout.



-- 
Brian E. Granger
Associate Professor of Physics and Data Science
Cal Poly State University, San Luis Obispo
@ellisonbg on Twitter and GitHub
[email protected] and [email protected]

