Github user echarles commented on the issue:

    https://github.com/apache/zeppelin/pull/2637
  
    @matyix There is a long history in Zeppelin around `spark.dep` vs. external dependencies in interpreter settings. I am a fan of the latter (interpreter settings), so if the `--packages` flag can make it work, that would be wonderful.
    
    I don't see in the Spark docs that `--packages` adds the jars to the executor classpath.
    
    The `spark.jars` property (a comma-separated list of local jars to include on the driver and executor classpaths) may be an alternative.
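    
    A minimal sketch of how the two approaches map onto Spark properties, assuming a plain `SparkConf` set before the context is created; the jar paths and Maven coordinate below are placeholders, not anything from this PR:
    
    ```scala
    import org.apache.spark.SparkConf
    
    val conf = new SparkConf()
      .setAppName("zeppelin-dependency-example")
      // spark.jars: comma-separated local jars shipped to the driver and executor classpaths
      .set("spark.jars", "/path/to/dep-a.jar,/path/to/dep-b.jar")
      // spark.jars.packages: Maven coordinates resolved at launch,
      // the configuration equivalent of passing --packages to spark-submit
      .set("spark.jars.packages", "org.apache.commons:commons-csv:1.5")
    ```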
    