Hi Gerald,

If you don't export SPARK_HOME, could you try adding your jar using

export ZEPPELIN_CLASSPATH=/path/to/your.jar

That should add your jar to the classpath of the Spark interpreter process.
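As a rough sketch (the jar path below is only a placeholder), that line would go into conf/zeppelin-env.sh:

# conf/zeppelin-env.sh -- placeholder path, adjust to your own jar;
# separate multiple jars with ':'
export ZEPPELIN_CLASSPATH=/opt/jars/custom-provider.jar

Then restart Zeppelin so the Spark interpreter process is relaunched with the new classpath.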
Hope this helps.

Best,
moon

On Fri, Jun 3, 2016 at 1:02 AM Gerald Loeffler <
gerald.loeff...@googlemail.com> wrote:

> JL,
>
> would it be possible to show the config settings for the 2nd option, i.e.
> without a separate SPARK_HOME (and therefore using Zeppelin's
> built-in Spark distribution)? That would be very helpful!
>
>   thank you in advance
>   gerald
>
>
> On Wednesday, 1 June 2016, Jongyoul Lee <jongy...@gmail.com> wrote:
>
>> Hi,
>>
>> Generally, if you have your own SPARK_HOME, it's fine to set SPARK_HOME in
>> zeppelin-env.sh and set your arguments in the interpreter tab. Otherwise,
>> you should add the jar to the Zeppelin classpath and set the arguments in
>> the interpreter tab with "--jars".
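>>
>> For the first case, a rough sketch (the paths are placeholders, not actual
>> settings from this thread):
>>
>> # conf/zeppelin-env.sh
>> export SPARK_HOME=/opt/spark
>>
>> and then pass the jar in the Spark interpreter settings on the interpreter
>> tab, e.g. as --jars /path/to/your.jar.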
>>
>> Hope this helps,
>> JL
>>
>> On Wed, Jun 1, 2016 at 5:55 PM, kishore r <vkishore...@gmail.com> wrote:
>>
>>> Hi
>>>
>>> We have built a custom data provider from which we read data and
>>> eventually create a DataFrame. Is it possible to add custom data
>>> providers to the Zeppelin classpath? If so, can you share your thoughts
>>> on how we can do that?
>>>
>>> Thanks
>>> K
>>>
>>
>>
>>
>> --
>> 이종열, Jongyoul Lee, 李宗烈
>> http://madeng.net
>>
>
>
> --
> --
> Gerald Loeffler
> mailto:gerald.loeff...@googlemail.com
> http://www.gerald-loeffler.net
>
>
