Hi,
You can place them under ${ZEPPELIN_HOME}/interpreter/spark/.
The other option is to set the dependencies in the interpreter tab of the
Spark interpreter. See here:
http://zeppelin.apache.org/docs/0.7.3/manual/dependencymanagement.html
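That page also covers dynamic dependency loading. As a rough sketch, assuming
the dep interpreter is enabled (the jar path below is a placeholder), a
paragraph like this, run before the Spark interpreter starts, loads a local jar:

%dep
// Hypothetical path to the Phoenix client jar; adjust to your installation.
z.load("/path/to/phoenix-client.jar")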
I'm not sure whether setting it through sc works.
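If you mean adding the jar from code: spark.jars only takes effect when the
SparkContext is created, so it is too late inside a running note. Roughly, as
a sketch with a placeholder path:

from pyspark import SparkConf, SparkContext

# spark.jars ships the listed jars to the driver and executors, but it is
# read only once, when the context is first created.
conf = SparkConf().set('spark.jars', '/path/to/phoenix-client.jar')
sc = SparkContext(conf=conf)

In Zeppelin the context is created for you, which is why the interpreter
setting is the usual place for this.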
Regards,
JL
On Thu, Oct
Hi,
I have those libraries, but where should I place them so that Zeppelin
can pick them up?
Or is there any way to set the library path through the SparkContext, i.e. using sc?
Regards
Indtiny
On Wed, Oct 25, 2017 at 9:22 PM, Jongyoul Lee wrote:
> Hi,
>
> I'm not sure, but you can try placing them under
Hi,
I'm not sure, but you can try placing them under interpreter/spark if
that is possible for you.
JL
On Wed, Oct 25, 2017 at 3:05 PM, Indtiny S wrote:
> Hi,
> I am trying to read HBase tables into a PySpark DataFrame.
> I am using the code below,
> but I am getting a ClassNotFoundException error:
>
> df=
Please help with this.
On Wed, Oct 25, 2017 at 11:35 AM, Indtiny S wrote:
> Hi,
> I am trying to read HBase tables into a PySpark DataFrame.
> I am using the code below,
> but I am getting a ClassNotFoundException error:
>
> df=sqlContext.read.format('jdbc').options(driver="org.apache.phoenix.j
Hi,
I am trying to read HBase tables into a PySpark DataFrame.
I am using the code below,
but I am getting a ClassNotFoundException error:
# .load() actually triggers the read (and the driver class lookup).
df = sqlContext.read.format('jdbc').options(
    driver='org.apache.phoenix.jdbc.PhoenixDriver',
    url='jdbc:phoenix:localhost:2181:/hbase-unsecure',
    dbtable='table_name').load()
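For reference, the equivalent read through the Phoenix Spark connector would
look roughly like this; a sketch, assuming the phoenix-spark jar is on the
classpath (the table name and ZooKeeper URL are the same placeholders as
above):

# Phoenix Spark connector instead of plain JDBC; it also needs its jar on
# both the driver and executor classpaths.
df = sqlContext.read.format('org.apache.phoenix.spark') \
    .option('table', 'table_name') \
    .option('zkUrl', 'localhost:2181:/hbase-unsecure') \
    .load()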