Exception with fields removed:

Error: java.lang.RuntimeException:
org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while
processing row {
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:544)
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:157)
    ... 8 more
Caused by: java.lang.NumberFormatException: Assignment would result in truncation
    at org.apache.hadoop.hive.common.type.HiveDecimal.<init>(HiveDecimal.java:53)
    at org.apache.hadoop.hive.common.type.HiveDecimal.<init>(HiveDecimal.java:47)
    at org.apache.hadoop.hive.common.type.HiveDecimal.add(HiveDecimal.java:178)
    at org.apache.hadoop.hive.ql.io.orc.ColumnStatisticsImpl$DecimalStatisticsImpl.merge(ColumnStatisticsImpl.java:494)
    at org.apache.hadoop.hive.ql.io.orc.WriterImpl$TreeWriter.createRowIndexEntry(WriterImpl.java:576)
    at org.apache.hadoop.hive.ql.io.orc.WriterImpl$TreeWriter.createRowIndexEntry(WriterImpl.java:583)
    at org.apache.hadoop.hive.ql.io.orc.WriterImpl.createRowIndexEntry(WriterImpl.java:1683)
    at org.apache.hadoop.hive.ql.io.orc.WriterImpl.addRow(WriterImpl.java:1855)
    at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.write(OrcOutputFormat.java:75)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:638)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:504)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:842)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:88)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:504)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:842)
    at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:91)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:504)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:842)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:534)
    ... 9 more
Container killed by the ApplicationMaster. Container killed on request. Exit code is 143


On Fri, Jan 10, 2014 at 4:32 PM, Kristopher Kane <kkane.l...@gmail.com> wrote:

> Hive 0.12 on Hadoop 2
>
> I have a table with a mix of STRING and DECIMAL fields that is stored as
> ORC with no compression or partitions.
> I wanted to create a copy of this table with CTAS, also stored as ORC.
> The job fails with a NumberFormatException in the HiveDecimal class, but I
> can't narrow it down to the specific field.
>
> The job succeeds if I store the new table with the default storage format.
>
> A bit strange going from ORC to ORC with CTAS, isn't it?
>
> Thanks,
>
> -Kris
>
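For reference, a minimal HiveQL sketch of the scenario described above. The table and column names are hypothetical (the original report does not give them); only the ORC-to-ORC CTAS pattern and the default-format workaround come from the report.

    -- Source table: a mix of STRING and DECIMAL columns, stored as ORC,
    -- no compression or partitions (as described above).
    CREATE TABLE src_orc (id STRING, amount DECIMAL)
    STORED AS ORC;

    -- CTAS copy into another ORC table: this is the statement that fails
    -- with NumberFormatException in HiveDecimal while the ORC writer runs.
    CREATE TABLE copy_orc STORED AS ORC AS
    SELECT * FROM src_orc;

    -- The same CTAS into the default storage format (TEXTFILE unless
    -- hive.default.fileformat is changed) reportedly succeeds.
    CREATE TABLE copy_text AS
    SELECT * FROM src_orc;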
