Github user echarles commented on the issue:

    https://github.com/apache/zeppelin/pull/2637
  
    @matyix I've given your last commit a try and cannot get the additional 
deps (set on the interpreter settings page) working.
    
    I don't see the `spark.jars` property in the command generated by 
interpreter.sh:
    
    ```
    /opt/spark/bin/spark-submit --class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer --driver-class-path ":/opt/zeppelin/interpreter/spark/*:/opt/zeppelin/lib/interpreter/*::/opt/zeppelin/interpreter/spark/zeppelin-spark_2.11-0.8.0-SNAPSHOT.jar:/etc/hdfs-k8s/conf" --driver-java-options " -Dfile.encoding=UTF-8 -Dlog4j.configuration=file:///opt/zeppelin/conf/log4j.properties -Dzeppelin.log.file=/opt/zeppelin/logs/zeppelin-interpreter---zeppelin-k8s-hdfs-locality-zeppelin-7cd554b49d-dpq2k.log" --master k8s://https://kubernetes:443 --conf spark.cores.max='1' --conf spark.shuffle.service.enabled='false' --conf spark.yarn.dist.archives=/opt/spark/R/lib/sparkr.zip --conf spark.executor.instances='3' --conf spark.sql.catalogImplementation='in-memory' --conf spark.app.name='zeppelin-k8s-spark' --conf spark.executor.memory='1g' --conf spark.master='k8s://https://kubernetes:443' --conf spark.kubernetes.namespace='default' --conf spark.kubernetes.executor.docker.image='datalayer/spark-k8s-executor:2.2.0-0.5.0' --conf spark.kubernetes.driver.docker.image='datalayer/spark-k8s-driver:2.2.0-0.5.0' --conf spark.kubernetes.initcontainer.docker.image='datalayer/spark-k8s-init:2.2.0-0.5.0' --conf spark.kubernetes.resourceStagingServer.uri='http://10.108.197.6:10000' --conf
    ```
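
    For comparison, this is a hedged sketch of the kind of flag I would expect interpreter.sh to append when settings-page dependencies are picked up. The jar paths and the `DEPS` variable are illustrative assumptions, not Zeppelin's actual variable names:

    ```shell
    # Hypothetical sketch: append a spark.jars conf entry to the generated
    # spark-submit options. The jar paths are placeholders; Zeppelin typically
    # downloads settings-page deps into its local-repo directory.
    DEPS="/opt/zeppelin/local-repo/dep1.jar,/opt/zeppelin/local-repo/dep2.jar"
    SPARK_SUBMIT_OPTIONS="$SPARK_SUBMIT_OPTIONS --conf spark.jars=$DEPS"
    echo "$SPARK_SUBMIT_OPTIONS"
    ```

    In the dump above, no `--conf spark.jars=...` of this shape appears anywhere in the generated command.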

