Divya:
Please try not to cross-post your question. 

In your case, the hbase-common jar is needed. To find all of the HBase jars required, you 
can run 'mvn dependency:tree' on your project and check its output. 
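For example (a rough sketch, assuming a Maven-based project), you can limit the output to the HBase artifacts:

mvn dependency:tree -Dincludes=org.apache.hbase

Once you know which jars are missing, add them to the --jars list you are already passing. The hbase-common path below is only a guess based on the HDP layout in your mail; the actual file name on your cluster may carry a version suffix:

spark-shell --jars /usr/hdp/2.3.4.0-3485/hbase/lib/hbase-common.jar,\
/usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler.jar,\
/usr/hdp/2.3.4.0-3485/hive/lib/zookeeper-3.4.6.2.3.4.0-3485.jar,\
/usr/hdp/2.3.4.0-3485/hive/lib/guava-14.0.1.jar,\
/usr/hdp/2.3.4.0-3485/hive/lib/protobuf-java-2.5.0.jar

The same comma-separated list works with spark-submit --jars, or the jars can be placed on spark.driver.extraClassPath / spark.executor.extraClassPath instead.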

> On Feb 29, 2016, at 1:48 AM, Divya Gehlot <divya.htco...@gmail.com> wrote:
> 
> Hi,
> I am trying to access a Hive table which has been created using HBaseIntegration. 
> I am able to access the data in the Hive CLI, 
> but when I try to access the table using the HiveContext of Spark
> I get the following error: 
>> java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
>>         at 
>> org.apache.hadoop.hive.hbase.HBaseSerDe.parseColumnsMapping(HBaseSerDe.java:184)
>>         at 
>> org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:73)
>>         at 
>> org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:117)
>>         at 
>> org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:53)
>>         at 
>> org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:521)
>>         at 
>> org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:391)
>>         at 
>> org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
>>         at 
>> org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
>>         at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
>>         at 
>> org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:330)
>>         at 
>> org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:325)
> 
> I have added the following jars to the Spark classpath:
> /usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler.jar,
> /usr/hdp/2.3.4.0-3485/hive/lib/zookeeper-3.4.6.2.3.4.0-3485.jar,
> /usr/hdp/2.3.4.0-3485/hive/lib/guava-14.0.1.jar,
> /usr/hdp/2.3.4.0-3485/hive/lib/protobuf-java-2.5.0.jar
> 
> Which jar files am I missing?
> 
> 
> Thanks,
> Regards,
> Divya 
