setting ZEPPELIN_CLASSPATH in zeppelin-env.sh worked just fine - thank you!
- gerald
--
Gerald Loeffler
mailto:gerald.loeff...@googlemail.com
http://www.gerald-loeffler.net
On Sat, Jun 4, 2016 at 2:36 AM, moon soo Lee wrote:
Thanks a lot. For a quick test I added the jar using %dep and it worked like
a charm.
https://zeppelin.incubator.apache.org/docs/latest/interpreter/spark.html
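For anyone following along, the %dep approach can look like the sketch below in a notebook paragraph (the jar path is a placeholder; note that %dep has to run before the Spark interpreter is first used, otherwise the dependency is not picked up):

```
%dep
// optional: clear any previously loaded dependencies
z.reset()
// load a local jar into the Spark interpreter's classpath
z.load("/path/to/your.jar")
```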
I will try using the CLASSPATH too. Appreciate your help
Kishore
On Fri, Jun 3, 2016 at 5:36 PM, moon soo Lee wrote:
Hi Gerald,
If you don't export SPARK_HOME, could you try adding the jar using
export ZEPPELIN_CLASSPATH=/path/to/your.jar
That is supposed to add your jar to the classpath of the Spark interpreter process.
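Spelled out in conf/zeppelin-env.sh, a minimal sketch (the jar paths are placeholders; as with any Java classpath, multiple jars are joined with ":"):

```shell
# conf/zeppelin-env.sh
# Add custom jars to the Spark interpreter's classpath
# (relevant when Zeppelin runs its embedded Spark, i.e. no SPARK_HOME set)
export ZEPPELIN_CLASSPATH="/path/to/your.jar:/path/to/another.jar"
```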
Hope this helps.
Best,
moon
On Fri, Jun 3, 2016 at 1:02 AM Gerald Loeffler <gerald.loeff...@googlemai
JL,
would it be possible to show the config settings for the second option, i.e.
without a separate SPARK_HOME (and therefore using Zeppelin's
built-in Spark distribution)? That would be very helpful!
thank you in advance
gerald
On Wednesday, 1 June 2016, Jongyoul Lee wrote:
Hi,
Generally, if you have your own "SPARK_HOME" setting, it's OK to set
SPARK_HOME in zeppelin-env.sh and put your arguments in the interpreter tab.
Otherwise, you should add the jar to the Zeppelin classpath and set the
arguments in the interpreter tab with "--jars".
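To make the SPARK_HOME option concrete, a minimal sketch (all paths are placeholders; SPARK_SUBMIT_OPTIONS in zeppelin-env.sh is one common place to put spark-submit arguments such as --jars, as an alternative to the interpreter tab):

```shell
# conf/zeppelin-env.sh
# Point Zeppelin at an external Spark installation
export SPARK_HOME=/path/to/spark
# Extra arguments passed to spark-submit when the interpreter starts
export SPARK_SUBMIT_OPTIONS="--jars /path/to/your.jar"
```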
Hope this helps,
JL
On Wed, Jun 1, 2016 at 5:55
Hi
We have built a custom data provider from which we read data and
eventually create a DataFrame. Is it possible to add the custom data
providers to the Zeppelin classpath? If so, could you share your thoughts
on how we can do it?
Thanks
K