Hi Andy, I think you can build that with some open-source packages/libraries
built for IPython and Spark.

Here is one: https://github.com/litaotao/IPython-Dashboard
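On the cluster-resource concern specifically (this is a general Spark setting, not something IPython-Dashboard requires): Spark's dynamic allocation can reclaim executors from idle notebook sessions, so an abandoned browser does not pin cluster resources indefinitely. A minimal spark-defaults.conf sketch, assuming an external shuffle service is running on the workers; the timeout and executor-cap values below are illustrative, not recommendations:

```
# spark-defaults.conf -- hedged sketch, values are illustrative
spark.dynamicAllocation.enabled              true
# dynamic allocation requires the external shuffle service
spark.shuffle.service.enabled                true
# release executors that have been idle for more than 2 minutes
spark.dynamicAllocation.executorIdleTimeout  120s
# cap what any one notebook session can grab
spark.dynamicAllocation.maxExecutors         10
```

Note that each notebook kernel still holds its driver process, so pairing this with an idle-culling mechanism on the notebook server itself would cover the abandoned-browser case more completely.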

On Thu, Mar 17, 2016 at 1:36 AM, Andy Davidson <
a...@santacruzintegration.com> wrote:

> We are considering deploying a notebook server for use by two kinds of
> users
>
>
>    1. Interactive dashboards
>       1. i.e. forms that allow users to select data sets and visualizations
>       2. Review real-time graphs of data captured by our Spark streams
>    2. General notebooks for data scientists
>
>
> My concern is that interactive Spark jobs can consume a lot of cluster
> resources, and many users may be sloppy/lazy, i.e. just kill their browsers
> instead of shutting down their notebooks cleanly.
>
> What are best practices?
>
>
> Kind regards
>
> Andy
>



-- 
*--------------------------------------*
a spark lover, a quant, a developer and a good man.

http://github.com/litaotao
