You can do it via the livy interpreter settings.

Here are two configuration properties that let you add external jars and
external packages:

livy.spark.jars
livy.spark.jars.packages
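
For example, in the Zeppelin interpreter settings for livy they could look
something like this (the jar path and the package coordinate below are only
illustrations; substitute your own):

livy.spark.jars              hdfs:///path/to/extra-library.jar
livy.spark.jars.packages     com.databricks:spark-csv_2.11:1.2.0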

And this is the configuration property for the yarn queue name:

livy.spark.yarn.queue
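
For example (the queue name below is just a placeholder for whatever queue
is defined on your cluster):

livy.spark.yarn.queue        myqueue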


On Wed, Nov 22, 2017 at 9:13 AM, Anandha L Ranganathan <analog.s...@gmail.com> wrote:

> We are using Livy interpreter from Zeppelin to connect to Spark.
>
> In this, we want to give users an option to load external
> libraries.
> By default we have added some basic libraries in the interpreter setting.
>
> In the spark interpreter, users can load the external libraries they
> want using this command:
> %spark.dep
> z.reset()
> z.addRepo("Spark Packages Repo").url("http://dl.bintray.com/spark-packages/maven")
> z.load("com.databricks:spark-csv_2.11:1.2.0")
>
>
> How can we import external libraries when using livy?
>
>
> Another question: is there a way to change the yarn queue name at runtime?
> Some users want to use a different queue rather than the default queue
> assigned in the interpreter.  If that feature is not available, what is the
> best approach to implement this?
>
> Thanks
> Anand
>
>
>
