This could be because the phoenix-server jar is missing from the HBase classpath.

Could you please copy the phoenix-server jar from the Phoenix distribution into
$HBASE_HOME/lib on all the nodes, restart HBase, and then start sqlline.

https://phoenix.apache.org/installation.html
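On each node, that would look roughly like the following. (The jar file name,
version, and $PHOENIX_HOME location below are illustrative; substitute the
actual jar name and paths from your own install.)

```shell
# Copy the Phoenix server jar into HBase's lib directory on this node.
# Jar name/version and PHOENIX_HOME are examples; use your actual paths.
cp "$PHOENIX_HOME"/phoenix-server-hbase-2.5-5.1.3.jar "$HBASE_HOME"/lib/

# Restart HBase so the master and region servers pick up the new jar
# (this is what lets them load org.apache.phoenix.schema.MetaDataSplitPolicy).
"$HBASE_HOME"/bin/stop-hbase.sh
"$HBASE_HOME"/bin/start-hbase.sh
```

Repeat the copy on every node in the cluster before restarting.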


Thanks,
Rajeshbabu.

On Sun, Aug 20, 2023, 4:34 AM Kal Stevens <kalgstev...@gmail.com> wrote:

> I am new to setting up a cluster, and I feel like I am doing something
> dumb.
>
> I get the following error message when I run sqlline to create a table,
> and I am not sure what I am doing wrong.
>
> I know that it is on the classpath for SQLLine, but I think it is making an
> RPC call to HBase. I am not sure, though, because I do not see anything
> incorrect about this in the HBase logs.
>
> These are the arguments that I am using for SqlLine (I am doing this in
> the IDE to debug it, not from the command line):
>
> -d org.apache.phoenix.jdbc.PhoenixEmbeddedDriver -u
> jdbc:phoenix:zookeeper:2181:/hbase:/keytab -n none -p none --color=true
> --fastConnect=false --verbose=true --incremental=false
> --isolation=TRANSACTION_READ_COMMITTED
>
>
> I am not sure what /hbase:/keytab are supposed to be
>
> It is trying to connect to this host/port
>
> hbase/192.168.1.162:16000
>
>
> It seems to be trying to create the "SYSTEM.CATALOG" table.
>
>
>
> Caused by:
> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException):
> org.apache.hadoop.hbase.DoNotRetryIOException: Unable to load configured
> region split policy 'org.apache.phoenix.schema.MetaDataSplitPolicy' for
> table 'SYSTEM.CATALOG' Set hbase.table.sanity.checks to false at conf or
> table descriptor if you want to bypass sanity checks
> at
> org.apache.hadoop.hbase.util.TableDescriptorChecker.warnOrThrowExceptionForFailure(TableDescriptorChecker.java:339)
> at
> org.apache.hadoop.hbase.util.TableDescriptorChecker.checkClassLoading(TableDescriptorChecker.java:331)
> at
> org.apache.hadoop.hbase.util.TableDescriptorChecker.sanityCheck(TableDescriptorChecker.java:110)
> at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:2316)
> at
> org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:691)
> at
> org.apache.hadoop.hbase.shaded.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:415)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:124)
> at org.apache.hadoop.hbase.ipc.RpcHandler.run(RpcHandler.java:102)
> at org.apache.hadoop.hbase.ipc.RpcHandler.run(RpcHandler.java:82)
> Caused by: java.io.IOException: Unable to load configured region split
> policy 'org.apache.phoenix.schema.MetaDataSplitPolicy' for table
> 'SYSTEM.CATALOG'
> at
> org.apache.hadoop.hbase.regionserver.RegionSplitPolicy.getSplitPolicyClass(RegionSplitPolicy.java:122)
> at
> org.apache.hadoop.hbase.util.TableDescriptorChecker.checkClassLoading(TableDescriptorChecker.java:328)
> ... 8 more
> Caused by: java.lang.ClassNotFoundException:
> org.apache.phoenix.schema.MetaDataSplitPolicy
> at
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
> at
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
> at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:527)
> at java.base/java.lang.Class.forName0(Native Method)
> at java.base/java.lang.Class.forName(Class.java:315)
> at
> org.apache.hadoop.hbase.regionserver.RegionSplitPolicy.getSplitPolicyClass(RegionSplitPolicy.java:118)
> ... 9 more
>
