Glad to hear that it works!
Actually, this is documented in the Dependency Management section of
https://zeppelin.incubator.apache.org/docs/0.5.5-incubating/interpreter/spark.html,
but it obviously seems hard to find for new users. So feel free to improve it.
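For readers skimming the thread: the fix discussed below boils down to appending one line to Spark's defaults file and restarting the interpreter. A minimal sketch (the SPARK_HOME default path here is a placeholder; point it at your actual Spark install):

```shell
# Sketch of the fix from this thread: set spark.jars.packages in
# spark-defaults.conf instead of the Zeppelin interpreter setting.
# SPARK_HOME below is an assumed placeholder; use your real install path.
SPARK_HOME="${SPARK_HOME:-/tmp/spark-demo}"
mkdir -p "$SPARK_HOME/conf"

# Append the comma-separated Maven coordinates on a single line.
echo 'spark.jars.packages   org.apache.avro:avro:1.8.0,org.joda:joda-convert:1.8.1' \
  >> "$SPARK_HOME/conf/spark-defaults.conf"

# Show the resulting line; restart the Zeppelin Spark interpreter
# afterwards so the new setting is picked up.
grep 'spark.jars.packages' "$SPARK_HOME/conf/spark-defaults.conf"
```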


On Wed, Mar 9, 2016 at 6:05 PM Chris Miller <cmiller11...@gmail.com> wrote:

> Oh, I see. Yeah, that's not documented... no wonder it's confusing. I'll
> open a PR with some improvements to the documentation for this case when I
> have a moment.
>
> Changing spark-defaults.conf as you suggested indeed worked. Thanks!
>
>
> --
> Chris Miller
>
> On Wed, Mar 9, 2016 at 10:04 AM, mina lee <mina...@apache.org> wrote:
>
>> Hi Chris,
>>
>> There are several ways to load dependencies in Zeppelin 0.5.5.
>> Using %dep is one of them.
>> If you want to do it by setting the spark.jars.packages property, the
>> proper way is to edit SPARK_HOME/conf/spark-defaults.conf and add the
>> line below. (I assume you set SPARK_HOME in
>> ZEPPELIN_HOME/conf/zeppelin-env.sh.)
>>
>> spark.jars.packages   org.apache.avro:avro:1.8.0,org.joda:joda-convert:1.8.1
>>
>> The reason you can import the avro dependency is that the Spark assembly
>> already includes the avro dependencies, not because you added it in the
>> Zeppelin interpreter setting.
>>
>> You can add dependencies via the GUI with the latest master branch
>> (0.6.0-incubating-SNAPSHOT), which is experimental at the moment.
>> Please let me know if this answers your question.
>>
>> Regards,
>> Mina
>>
>> On Wed, Mar 9, 2016 at 1:41 AM Chris Miller <cmiller11...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> I have a strange situation going on. I'm running Zeppelin 0.5.5 and
>>> Spark 1.6.0 (on Amazon EMR). I added this property to the interpreter
>>> settings (and restarted it):
>>>
>>>
>>> spark.jars.packages: org.apache.avro:avro:1.8.0,org.joda:joda-convert:1.8.1
>>>
>>> The avro dependency loads fine and I'm able to import and use it.
>>> However, if I try to import something in the joda-convert package (such
>>> as org.joda.convert.FromString), I get the error "error: object convert
>>> is not a member of package org.joda".
>>>
>>> If I run spark-shell from the CLI and include the same string above in
>>> the --packages parameter, I'm able to import joda-convert just fine.
>>> Also, if I restart the interpreter and manually import the dependency with
>>> z.load(), it also works fine:
>>>
>>> %dep
>>> z.load("org.joda:joda-convert:1.8.1")
>>>
>>> So, what's going on here?
>>>
>>> --
>>> Chris Miller
>>>
>>
>
