Hi,

You can place them under ${ZEPPELIN_HOME}/interpreter/spark/
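For example (just a sketch; adjust the paths to wherever your jars actually
live):

  cp phoenix-4.7.0-HBase-1.1-client.jar phoenix-spark-4.7.0-HBase-1.1.jar \
     ${ZEPPELIN_HOME}/interpreter/spark/

Then restart the Spark interpreter from the Interpreter page so the jars get
picked up.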

The other option is to set the dependencies in the interpreter tab for the
Spark interpreter. See here:
http://zeppelin.apache.org/docs/0.7.3/manual/dependencymanagement.html
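If I remember correctly, the artifact field there accepts a local file path
as well as a groupId:artifactId:version coordinate, so an entry like the
following should also work (the path is only a placeholder):

  artifact: /path/to/phoenix-4.7.0-HBase-1.1-client.jar

Save the setting and restart the interpreter so the driver class ends up on
the classpath.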

I'm not sure whether setting them via sc works or not.

Regards,
JL

On Thu, Oct 26, 2017 at 3:43 PM, Indtiny S <indt...@gmail.com> wrote:

> Hi,
>
> I have those libraries, but where should I place them so that Zeppelin
> can pick them up?
>
> Or is there any way to set the library path using the SparkContext, i.e.
> using sc?
>
>
> Regards
> In
>
> On Wed, Oct 25, 2017 at 9:22 PM, Jongyoul Lee <jongy...@gmail.com> wrote:
>
> > Hi,
> >
> > I'm not sure, but you can try placing them under interpreter/spark if you
> > can do that.
> >
> > JL
> >
> > On Wed, Oct 25, 2017 at 3:05 PM, Indtiny S <indt...@gmail.com> wrote:
> >
> > > Hi,
> > > I am trying to read HBase tables into a PySpark data frame
> > > using the code below,
> > > but I am getting a ClassNotFoundException:
> > >
> > >  df = sqlContext.read.format('jdbc').options(
> > >      driver="org.apache.phoenix.jdbc.PhoenixDriver",
> > >      url='jdbc:phoenix:localhost:2181:/hbase-unsecure',
> > >      dbtable='table_name').load()
> > >
> > >
> > > java.lang.ClassNotFoundException: org.apache.phoenix.jdbc.PhoenixDriver
> > >
> > >
> > > I have the libraries phoenix-spark-4.7.0-HBase-1.1.jar and
> > > phoenix-4.7.0-HBase-1.1-client.jar but don't know where to place them.
> > >
> > >
> > > I am using Zeppelin 0.7.0.
> > >
> > >
> > > Rgds
> > >
> > > In
> > >
> >
> >
> > --
> > 이종열, Jongyoul Lee, 李宗烈
> > http://madeng.net
> >
>



-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net
