Github user matyix commented on the issue:

    https://github.com/apache/zeppelin/pull/2637
  
    @echarles Currently there are two ways to add external dependencies. The 
first is to add a new paragraph to the notebook that uses the `spark.dep` 
interpreter and `z.load()`; this works because the dependencies are downloaded 
inside the driver, although there is an issue with it on Spark 2.2 and Scala 
2.11.8 similar to https://issues.apache.org/jira/browse/ZEPPELIN-2475. The 
second is to add the artifacts to the interpreter setting, but that doesn't 
work in Spark `cluster` deploy mode, since the dependencies are downloaded 
locally. Maybe we can think about something like: in case of 
`deployMode=cluster`, add the artifacts to `--packages` automatically and 
don't download them locally.
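    For illustration, the first approach looks roughly like the following 
notebook paragraph (the Maven coordinate here is just an example artifact, not 
one from this PR):

```scala
%spark.dep
z.reset()                                     // clear any previously loaded artifacts
z.load("org.apache.commons:commons-csv:1.5")  // Maven coordinate, resolved inside the driver
```

    Under the proposal above, artifacts configured in the interpreter setting 
would, in `deployMode=cluster`, instead be handed to Spark itself, roughly 
equivalent to passing `--packages org.apache.commons:commons-csv:1.5` to 
`spark-submit`, so the executors resolve them rather than Zeppelin downloading 
them locally.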

