Hi,
At the beginning I was wondering that myself too: I don't know why
hbase-common wasn't being downloaded and included, so I added it
explicitly.
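For context, here is a minimal sketch of the kind of code the connector ends
up executing (not my actual job; the class name, table name and column family
below are placeholders): HBaseConfiguration, TableName and Bytes all live in
hbase-common, so if that jar is missing from the submitted fat jar the job
compiles fine but fails at runtime with a ClassDefNotFound-style error.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;   // hbase-common
import org.apache.hadoop.hbase.TableName;            // hbase-common
import org.apache.hadoop.hbase.client.Connection;    // hbase-client
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;           // hbase-common

public class HBaseClasspathSmokeTest {

    public static void main(String[] args) throws Exception {
        // Loading this class already requires hbase-common on the classpath.
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("test"))) {
            Put put = new Put(Bytes.toBytes("row-1"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v"));
            table.put(put);
        }
    }
}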
I was in the process of writing that I may have solved this weird issue:
apparently the shading worked and the ClassDefNotFound issue was caused
Looking at the dependencies for flink-hbase, we have:
[INFO] +- org.apache.hbase:hbase-server:jar:1.3.1:compile
[INFO] | +- org.apache.hbase:hbase-common:jar:1.3.1:compile
[INFO] | +- org.apache.hbase:hbase-protocol:jar:1.3.1:compile
[INFO] | +- org.apache.hbase:hbase-procedure:jar:1.3.1:compile
[
Hi,
why do you need to add hbase-common as a separate dependency? Doesn't the
"flink-hbase" dependency transitively pull in hbase?
On Fri, Aug 25, 2017 at 6:35 PM, Ted Yu wrote:
> If Guava 18.0 is used to build hbase 1.3, there would be compilation
> errors such as the following:
>
> [ERROR] /m
If Guava 18.0 is used to build hbase 1.3, there would be compilation errors
such as the following:
[ERROR] /mnt/disk2/a/1.3-h/hbase-server/src/main/java/org/apache/hadoop/hbase/replication/regionserver/ReplicationSource.java:[271,25] error: cannot find symbol
[ERROR] symbol: method stopAndWai
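The truncated symbol appears to be Guava's Service.stopAndWait(), which the
HBase 1.3 replication code calls but which is no longer part of the Service
API in Guava 18. A minimal sketch (not HBase source; NoopEndpoint is just a
stand-in for a replication endpoint, which extends a Guava Service) of the
API difference:

import com.google.common.util.concurrent.AbstractIdleService;
import com.google.common.util.concurrent.Service;

public class GuavaServiceShutdown {

    // Trivial stand-in for a Guava-Service-based endpoint.
    static class NoopEndpoint extends AbstractIdleService {
        @Override protected void startUp() {}
        @Override protected void shutDown() {}
    }

    public static void main(String[] args) {
        Service endpoint = new NoopEndpoint();
        endpoint.startAsync().awaitRunning();

        // Older Guava (what HBase 1.3 compiles against):
        //     endpoint.stopAndWait();
        // That method is gone in Guava 18, hence "cannot find symbol".
        // The replacement lifecycle calls are:
        endpoint.stopAsync().awaitTerminated();
    }
}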
Hello everyone, I'm new to Flink and am encountering a nasty problem while
trying to submit a streaming Flink Job. I'll try to explain it as
thoroughly as possible.
Premise: I'm using an HDP 2.6 Hadoop cluster, with Hadoop version
2.7.3.2.6.1.0-129, and Flink compiled from source accordingly (Maven 3