Re: Read HBase table in PySpark

2017-10-25 Thread Indtiny S
Please help on this. On Wed, Oct 25, 2017 at 11:35 AM, Indtiny S wrote: > Hi, > I am trying to read HBase tables into a PySpark data frame > using the below code, > but I am getting a ClassNotFoundException error: > > df=sqlContext.read.format('jdbc').options(driver="org.apache.phoenix.j
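The archive truncates the snippet above; a read of that shape would look roughly like the following in a %pyspark paragraph. The ZooKeeper quorum, port and table name are placeholders rather than values from the original mail, and the Phoenix client jar still has to be on the Spark interpreter classpath for the driver class to resolve:

    # Sketch only -- connection details and table name are placeholders.
    df = sqlContext.read.format("jdbc").options(
        driver="org.apache.phoenix.jdbc.PhoenixDriver",
        url="jdbc:phoenix:zk-host:2181:/hbase",
        dbtable="MY_TABLE").load()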

Re: Read HBase table in PySpark gives java.lang.ClassNotFoundException: org.apache.phoenix.jdbc.PhoenixDriver

2017-10-25 Thread Jongyoul Lee
Hi, I'm not sure, but you can try placing them under interpreter/spark if you can. JL. On Wed, Oct 25, 2017 at 3:05 PM, Indtiny S wrote: > Hi, > I am trying to read HBase tables into a PySpark data frame > using the below code, > but I am getting a ClassNotFoundException error > > df=
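Assuming the Phoenix client and phoenix-spark jars end up under interpreter/spark as suggested (the thread does not say which jars were copied), Phoenix also ships a dedicated Spark data source that avoids the raw JDBC options; a minimal sketch with a placeholder table name and ZooKeeper URL:

    # Sketch only -- table name and zkUrl are placeholders.
    df = sqlContext.read.format("org.apache.phoenix.spark") \
        .option("table", "MY_TABLE") \
        .option("zkUrl", "zk-host:2181") \
        .load()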

Re: Multiple Spark Versions / Interpreter on CDH 5.9 Cluster

2017-10-25 Thread Jongyoul Lee
Hi, You can set SPARK_HOME in the interpreter tab instead of setting it in zeppelin-env.sh. Remove SPARK_HOME from your zeppelin-env.sh and set it in the interpreter tab for two different Spark interpreters. On Tue, Oct 24, 2017 at 11:39 AM, Jeff Zhang wrote: > > Take a look at spark submit command >
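For illustration only (the interpreter names and paths below are invented, not taken from the thread), the setup described above amounts to two Spark interpreters whose SPARK_HOME property is set per interpreter in the Interpreter tab, with the export removed from conf/zeppelin-env.sh so it does not override them:

    spark16  ->  SPARK_HOME = /opt/spark-1.6.3
    spark2   ->  SPARK_HOME = /opt/spark-2.1.0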

Re: Weird error in trying to set up a Hive interpreter.

2017-10-25 Thread Michael Segel
Hi, You were right. Hive wasn’t up and running. I’ve re-installed MapR and created the interpreter… but then ran into a curious issue…. If I created a new note and made Hive my default interpreter, I could run Hive commands. But… if I took a Spark note and then tried to run %hive … within the no

Re: Weird error in trying to set up a Hive interpreter.

2017-10-25 Thread Jeff Zhang
What do you mean by a Spark note? There's no concept of a Spark note. Could you give more details about your problem? Maybe a screenshot would be helpful. Michael Segel wrote on Thursday, October 26, 2017 at 12:07 AM: > Hi, > > You were right. Hive wasn’t up and running. > > I’ve re-installed MapR, created the interpreter… but then

Re: Weird error in trying to set up a Hive interpreter.

2017-10-25 Thread Michael Segel
A notebook with Spark as the default interpreter? Sorry, not sure how to describe it… So… I can create a new note with Hive as the default interpreter… everything is OK. OK, that’s weird… now it works. I just created a Spark notebook, ran %hive and then show databases; That worked…. O
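For later readers of the archive, the behaviour being described is Zeppelin's per-paragraph interpreter selection: in a note whose default interpreter is spark, a paragraph can still be routed to the Hive interpreter by starting it with the %hive directive, e.g.:

    %hive
    show databases;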

Re: Read HBase table in PySpark gives java.lang.ClassNotFoundException: org.apache.phoenix.jdbc.PhoenixDriver

2017-10-25 Thread Indtiny S
Hi, I have those libraries, but where should I place them so that Zeppelin can pick them up? Or is there any way to set the library path using the SparkContext, i.e. using sc? Regards, In. On Wed, Oct 25, 2017 at 9:22 PM, Jongyoul Lee wrote: > Hi, > > I'm not sure, but you can try placing them under
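One way to check from a %pyspark paragraph whether Zeppelin has actually picked the jar up (a suggestion, not something stated in the thread; the class name is the one from the subject line):

    # Throws java.lang.ClassNotFoundException while the Phoenix client jar is
    # missing from the Spark interpreter classpath; returns the Class object
    # once the jar is visible (e.g. after copying it under interpreter/spark/
    # and restarting the interpreter).
    sc._jvm.java.lang.Class.forName("org.apache.phoenix.jdbc.PhoenixDriver")

As far as I know, a JDBC driver jar cannot be added through sc after the interpreter's JVM has already started, so placing the jar where the interpreter loads it (or declaring it as an interpreter dependency) is the usual route.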