[ https://issues.apache.org/jira/browse/HIVE-15883?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15868504#comment-15868504 ]
Aihua Xu commented on HIVE-15883:
---------------------------------

The change looks good to me. Can we add a unit test to cover this?

> HBase mapped table in Hive insert fail for decimal
> --------------------------------------------------
>
>                 Key: HIVE-15883
>                 URL: https://issues.apache.org/jira/browse/HIVE-15883
>             Project: Hive
>          Issue Type: Bug
>          Components: Hive
>    Affects Versions: 1.1.0
>            Reporter: Naveen Gangam
>            Assignee: Naveen Gangam
>         Attachments: HIVE-15883.patch
>
>
> CREATE TABLE hbase_table (
>   id int,
>   balance decimal(15,2))
> ROW FORMAT DELIMITED
> COLLECTION ITEMS TERMINATED BY '~'
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES (
>   "hbase.columns.mapping"=":key,cf:balance#b");
>
> insert into hbase_table values (1,1);
> ----
> Diagnostic Messages for this Task:
> Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"tmp_values_col1":"1","tmp_values_col2":"1"}
>         at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:179)
>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1783)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"tmp_values_col1":"1","tmp_values_col2":"1"}
>         at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:507)
>         at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:170)
>         ... 8 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.serde2.SerDeException: java.lang.RuntimeException: Hive internal error.
>         at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:733)
>         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
>         at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
>         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
>         at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:97)
>         at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:157)
>         at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:497)
>         ... 9 more
> Caused by: org.apache.hadoop.hive.serde2.SerDeException: java.lang.RuntimeException: Hive internal error.
>         at org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(HBaseSerDe.java:286)
>         at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:668)
>         ... 15 more
> Caused by: java.lang.RuntimeException: Hive internal error.
>         at org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitive(LazyUtils.java:328)
>         at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:220)
>         at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serializeField(HBaseRowSerializer.java:194)
>         at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:118)
>         at org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(HBaseSerDe.java:282)
>         ... 16 more

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
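On the unit test: the bottom of the trace shows LazyUtils.writePrimitive throwing "Hive internal error." while HBaseRowSerializer serializes the "#b" (binary) mapped decimal column, which suggests the binary write path has no handling for DECIMAL values. A qfile test that round-trips a decimal through a binary-mapped HBase column should cover the fix. A minimal sketch along the lines of the repro in the description (table and test names are illustrative, not taken from the patch):

CREATE TABLE hbase_decimal_binary (
  id int,
  balance decimal(15,2))
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  "hbase.columns.mapping"=":key,cf:balance#b");

-- previously failed with java.lang.RuntimeException: Hive internal error.
insert into hbase_decimal_binary values (1, 1);
insert into hbase_decimal_binary values (2, 12345.67);

-- verify the values survive the write/read round trip through the HBase serde
select id, balance from hbase_decimal_binary order by id;

drop table hbase_decimal_binary;

Something like this could sit alongside the other hbase-handler qfile tests (assuming the usual hbase-handler test setup) with a corresponding golden output file.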