Looking at your classpath, it looks like you've compiled Spark yourself.
Depending on which version of Hadoop you compiled against (it looks like
Hadoop 2.2 in your case), Spark will bundle its own version of protobuf.
The error indicates incompatible protobuf versions, so you should make
sure both your HBase and Spark builds are compiled against the same
protobuf version.
Please take a look at 4.1.1 under
http://hbase.apache.org/book.html#basic.prerequisites
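If you want to confirm which protobuf jar each process actually loads, a
quick diagnostic along these lines can help (a minimal sketch; the class
name ProtobufCheck is made up here, and it assumes protobuf-java is on
the classpath):

import com.google.protobuf.Message;

public class ProtobufCheck {
  public static void main(String[] args) {
    // Print the jar the protobuf classes are loaded from, to spot a
    // Spark-bundled copy shadowing the one HBase was built against.
    System.out.println(
        Message.class.getProtectionDomain().getCodeSource().getLocation());
  }
}

Running it once with Spark's classpath and once with HBase's shows
whether the two resolve to the same jar.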
Cheers
On Thu, Apr 30, 2015 at 3:49 AM, Saurabh Gupta wrote:
I am now able to solve the issue by setting
SparkConf sconf = new SparkConf().setAppName("App").setMaster("local")
and
conf.set("zookeeper.znode.parent", "/hbase-unsecure")
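Putting the two settings together, a working end-to-end read looks
roughly like this (a sketch only; the class name HBaseCount is
hypothetical, the table name 'test' comes from the scan output below,
and it assumes the Spark 1.x Java API with spark-core and hbase on the
classpath):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class HBaseCount {
  public static void main(String[] args) {
    // Local master, as in the fix above.
    SparkConf sconf = new SparkConf().setAppName("App").setMaster("local");
    JavaSparkContext sc = new JavaSparkContext(sconf);

    Configuration conf = HBaseConfiguration.create();
    // Some standalone/HDP setups keep the znode under /hbase-unsecure
    // instead of the default /hbase.
    conf.set("zookeeper.znode.parent", "/hbase-unsecure");
    conf.set(TableInputFormat.INPUT_TABLE, "test");

    JavaPairRDD<ImmutableBytesWritable, Result> rdd =
        sc.newAPIHadoopRDD(conf, TableInputFormat.class,
            ImmutableBytesWritable.class, Result.class);
    System.out.println("rows: " + rdd.count());
    sc.stop();
  }
}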
The standalone HBase has a table 'test':
hbase(main):001:0> scan 'test'
ROW COLUMN+CELL
row1
I am using HBase 0.94.8.
On Wed, Apr 29, 2015 at 11:56 PM, Ted Yu wrote:
Can you enable HBase DEBUG logging in log4j.properties so that we can get
more clues?
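For reference, turning on DEBUG for HBase usually means a line like the
following in log4j.properties (standard log4j 1.x logger syntax; narrow
the package if you only want client-side logs):

log4j.logger.org.apache.hadoop.hbase=DEBUG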
What HBase release are you using?
Cheers
On Wed, Apr 29, 2015 at 4:27 AM, Saurabh Gupta wrote:
> Hi,
>
> I am working with standalone HBase, and I want to run HBaseTest.scala
> (from the Scala examples).