[ https://issues.apache.org/jira/browse/HIVE-10990?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14708003#comment-14708003 ]
meiyoula commented on HIVE-10990:
---------------------------------

I also hit the same problem with the *spark on hbase* function.

{quote}
ERROR CliDriver: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:433)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$runHive$1.apply(ClientWrapper.scala:418)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:256)
	at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:211)
	at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:248)
	at org.apache.spark.sql.hive.client.ClientWrapper.runHive(ClientWrapper.scala:418)
	at org.apache.spark.sql.hive.client.ClientWrapper.runSqlHive(ClientWrapper.scala:408)
	at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:558)
	at org.apache.spark.sql.hive.execution.HiveNativeCommand.run(HiveNativeCommand.scala:33)
	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
	at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:69)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:140)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:138)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:138)
	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:927)
	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:927)
	at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
	at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:129)
	at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:719)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:61)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:304)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:223)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:675)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:123)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
{quote}
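The method descriptor in the error, addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V, is a void-returning signature, which suggests the calling code (the Hive 1.2 HBase handler) was built against a pre-1.0 hbase-client where HTableDescriptor.addFamily returned void. In HBase 1.0.x the same setter returns HTableDescriptor for chaining, so bytecode compiled against the old client fails to link against a 1.0.1.1 client jar and the JVM throws a NoSuchMethodError carrying exactly that descriptor. Below is a minimal sketch of such a call site, assuming an hbase-client jar on the classpath; the class name and main wrapper are illustrative only, not Hive code.

{code:java}
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;

public class AddFamilyCallSite {
    public static void main(String[] args) {
        HTableDescriptor tableDesc = new HTableDescriptor(TableName.valueOf("xyz"));
        // Compiled against a pre-1.0 hbase-client, this call is recorded in the
        // bytecode as addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
        // (void return). hbase-client 1.0.x only provides
        // addFamily(...)Lorg/apache/hadoop/hbase/HTableDescriptor;, so running
        // the old bytecode against a 1.0.1.1 jar fails at this line with a
        // NoSuchMethodError matching the descriptor in the stack trace above.
        tableDesc.addFamily(new HColumnDescriptor("cf1"));
    }
}
{code}

The sketch compiles against either client version (the return value is ignored); the failure only appears when the compile-time and runtime hbase-client versions straddle the signature change.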
> Compatibility Hive-1.2 and hbase-1.0.1.1
> ----------------------------------------
>
>              Key: HIVE-10990
>              URL: https://issues.apache.org/jira/browse/HIVE-10990
>          Project: Hive
>       Issue Type: Bug
>       Components: Beeline, HBase Handler, HiveServer2
> Affects Versions: 1.2.0
>         Reporter: gurmukh singh
>         Assignee: Swarnim Kulkarni
>
> Hive external tables work fine with HBase.
> Hive-1.2 and hbase-1.0.1.1, hadoop-2.5.2
> Not able to create a table in HBase from Hive.
>
> 1: jdbc:hive2://edge1.dilithium.com:10000/def> TBLPROPERTIES ("hbase.table.name" = "xyz");
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
> Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V (state=08S01,code=1)
>
> [hdfs@edge1 cluster]$ hive
> 2015-06-12 17:56:49,952 WARN [main] conf.HiveConf: HiveConf of name hive.metastore.local does not exist
> Logging initialized using configuration in jar:file:/usr/local/cluster/apache-hive-1.2.0-bin/lib/hive-common-1.2.0.jar!/hive-log4j.properties
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/local/cluster/apache-hive-1.2.0-bin/auxlib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/cluster/hadoop-2.5.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>
> hive> CREATE TABLE hbase_table_1(key int, value string)
>     > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>     > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
>     > TBLPROPERTIES ("hbase.table.name" = "xyz");
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
>
> =======================
> scan complete in 1535ms
> 14 driver classes found
> Compliant  Version  Driver Class
> no         5.1      com.mysql.jdbc.Driver
> no         5.1      com.mysql.jdbc.NonRegisteringDriver
> no         5.1      com.mysql.jdbc.NonRegisteringReplicationDriver
> no         5.1      com.mysql.jdbc.ReplicationDriver
> yes        1.2      org.apache.calcite.avatica.remote.Driver
> yes        1.2      org.apache.calcite.jdbc.Driver
> yes        1.0      org.apache.commons.dbcp.PoolingDriver
> yes        10.11    org.apache.derby.jdbc.AutoloadedDriver
> yes        10.11    org.apache.derby.jdbc.Driver42
> yes        10.11    org.apache.derby.jdbc.EmbeddedDriver
> yes        10.11    org.apache.derby.jdbc.InternalDriver
> no         1.2      org.apache.hive.jdbc.HiveDriver
> yes        1.0      org.datanucleus.store.rdbms.datasource.dbcp.PoolingDriver
> no         5.1      org.gjt.mm.mysql.Driver
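For anyone hitting this, a quick way to confirm which addFamily signature the Hive or Spark classpath actually resolves is a reflection check run against whatever hbase-client jar is picked up first. This is a diagnostic sketch only, not part of Hive; the class name is made up for illustration.

{code:java}
import java.lang.reflect.Method;

public class CheckAddFamily {
    public static void main(String[] args) throws Exception {
        // Load the descriptor classes from whichever hbase-client jar is on the classpath.
        Class<?> tableDesc = Class.forName("org.apache.hadoop.hbase.HTableDescriptor");
        Class<?> colDesc = Class.forName("org.apache.hadoop.hbase.HColumnDescriptor");
        Method addFamily = tableDesc.getMethod("addFamily", colDesc);
        // A pre-1.0 client prints "void"; a 1.0.x client prints
        // "org.apache.hadoop.hbase.HTableDescriptor".
        System.out.println(addFamily.getReturnType().getName());
    }
}
{code}

If this prints the 1.0.x return type while the hive-hbase-handler in use was built against the older client, the DDLTask failure above is expected until the handler and the HBase client jars are aligned.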