Hello All,

I've been using Apache Zeppelin against Apache Spark clusters with
PySpark. One of the things I often need to do is install libraries and
packages on my cluster. For instance, I would like numpy, scipy, and other
data science libraries to be present across the cluster for data analysis.
However, the %sh interpreter only runs on the Zeppelin host, so any pip
install commands affect that machine alone.
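
For example, a paragraph like the following installs the packages only on
the Zeppelin host itself, never on the worker nodes where the PySpark
executors run:

    %sh
    # Runs on the Zeppelin host only; the Spark executors on the
    # worker nodes never see these packages.
    pip install numpy scipy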

- How are other users tackling this problem?
- Do you have a base set of libraries that you always install?
- Does Apache Zeppelin provide a clustered shell interpreter that can run
commands over SSH on every node? (See the sketch after this list.)
(I looked but didn't find any existing issues or pull requests related to
this.)
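
What I have in mind for the last point is roughly the manual equivalent
below; the worker hostnames are just placeholders for whatever the cluster
actually uses:

    %sh
    # Manual workaround I would like to avoid: SSH into every worker
    # node and pip install there (hostnames are placeholders).
    for host in worker-1 worker-2 worker-3; do
        ssh "$host" "pip install numpy scipy"
    done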

Thanks,
