Hi, I ran into this issue before.
The cause is that the HBase client bundled with Spark is 0.94.6, while your
server is 0.96.1.1.
To fix it, you can take one of the following approaches:
a) deploy an HBase cluster running version 0.94.6
b) rebuild Spark (see the sketch after this list):
step 1: change the HBase version in pom.xml to 0.96.1.1
step 2: change the HBase artifactId in examples/pom.xml to hbase-it
step 3: rebuild Spark with Maven
c) try adding the HBase jars to SPARK_CLASSPATH (I did not try this way
myself; see the second sketch below)
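
For option b), the edits would look roughly like this. This is only a sketch
against the Spark 1.0.0 source tree; I am assuming the version is controlled
by an <hbase.version> property in the top-level pom.xml, so double-check the
property name and module layout in your checkout:

    <!-- pom.xml: point the HBase version property at your server's version -->
    <hbase.version>0.96.1.1</hbase.version>

    <!-- examples/pom.xml: the 0.96 line split up the old monolithic "hbase"
         artifact, so switch the artifactId as described in step 2 -->
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-it</artifactId>
      <version>${hbase.version}</version>
    </dependency>

then rebuild:

    mvn -DskipTests clean package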
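For option c), a minimal, untested sketch, assuming a standard HBase binary
layout with the client jars under lib/ (the install path below is just an
example, adjust it to yours):

    # conf/spark-env.sh (untested): put the 0.96.1.1 client jars on the
    # classpath ahead of the 0.94.6 client bundled with Spark
    export SPARK_CLASSPATH=/opt/hbase-0.96.1.1-hadoop2/lib/*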
2014-07-04 1:19 GMT-07:00 N.Venkata Naga Ravi <[email protected]>:
> Hi,
>
> Any update on a solution? We are still facing this issue...
> We were able to connect to HBase with standalone code, but we hit this
> issue with the Spark integration.
>
> Thx,
> Ravi
>
> ------------------------------
> From: [email protected]
> To: [email protected]; [email protected]
> Subject: RE: Spark with HBase
> Date: Sun, 29 Jun 2014 15:32:42 +0530
>
> [email protected]
>
> ------------------------------
> From: [email protected]
> To: [email protected]
> Subject: Spark with HBase
> Date: Sun, 29 Jun 2014 15:28:43 +0530
>
> I am using the following versions:
>
> spark-1.0.0-bin-hadoop2
> hbase-0.96.1.1-hadoop2
>
>
> When executing the HBaseTest example, I am getting the following exception.
> It looks like a version incompatibility; can you please help with it?
>
> NERAVI-M-70HY:spark-1.0.0-bin-hadoop2 neravi$ ./bin/run-example org.apache.spark.examples.HBaseTest local localhost:4040 test
>
>
>
> 14/06/29 15:14:14 INFO RecoverableZooKeeper: The identifier of this process is [email protected]
> 14/06/29 15:14:14 INFO ClientCnxn: Opening socket connection to server localhost/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error)
> 14/06/29 15:14:14 INFO ClientCnxn: Socket connection established to localhost/0:0:0:0:0:0:0:1:2181, initiating session
> 14/06/29 15:14:14 INFO ClientCnxn: Session establishment complete on server localhost/0:0:0:0:0:0:0:1:2181, sessionid = 0x146e6fa10750009, negotiated timeout = 40000
> Exception in thread "main" java.lang.IllegalArgumentException: Not a host:port pair: PBUF
>
> 192.168.1.6�����(
>         at org.apache.hadoop.hbase.util.Addressing.parseHostname(Addressing.java:60)
>         at org.apache.hadoop.hbase.ServerName.<init>(ServerName.java:101)
>         at org.apache.hadoop.hbase.ServerName.parseVersionedServerName(ServerName.java:283)
>         at org.apache.hadoop.hbase.MasterAddressTracker.bytesToServerName(MasterAddressTracker.java:77)
>         at org.apache.hadoop.hbase.MasterAddressTracker.getMasterAddress(MasterAddressTracker.java:61)
>         at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:703)
>         at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:126)
>         at org.apache.spark.examples.HBaseTest$.main(HBaseTest.scala:37)
>         at org.apache.spark.examples.HBaseTest.main(HBaseTest.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:483)
>         at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
> Thanks,
> Ravi
>