I'm using 1.1.0.
I always thought these issues were resolved way back at 0.13-0.14.
So rewriting the data is the only way to handle this?
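For reference, the rewrite I have in mind looks roughly like this (table, column, and partition names below are placeholders, and this assumes the table schema already declares the column as BIGINT):

```sql
-- Placeholder names throughout; sketch only.
-- Rewrite one pre-schema-change partition so its ORC files
-- are re-written with the column stored as BIGINT:
INSERT OVERWRITE TABLE mytable PARTITION (dt='2016-01-01')
SELECT id,
       CAST(old_int_col AS BIGINT) AS old_int_col,
       other_col
FROM mytable
WHERE dt = '2016-01-01';
```

If reading the old partition through Hive trips the same ClassCastException, the SELECT may need to run from a staging copy of the data, or with the partition-level column type temporarily set back to int.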

Thank you,
Daniel



On Wed, Dec 14, 2016 at 8:42 PM, Owen O'Malley <omal...@apache.org> wrote:

> Which version of Hive are you on? Hive 2.1 should automatically handle the
> type conversions from the file to the table.
>
> .. Owen
>
> On Wed, Dec 14, 2016 at 9:36 AM, Daniel Haviv <
> daniel.ha...@veracity-group.com> wrote:
>
>> Hi,
>> I have an ORC table where one of the fields was an int and is now a
>> bigint.
>> Whenever I query a partition before the schema change I encounter the
>> following error:
>> Error: java.io.IOException: java.io.IOException:
>> java.lang.ClassCastException: org.apache.hadoop.io.IntWritable cannot be
>> cast to org.apache.hadoop.io.LongWritable
>>         at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
>>         at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
>>         at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:226)
>>         at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.next(HadoopShimsSecure.java:136)
>>         at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:199)
>>         at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:185)
>>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:52)
>>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:415)
>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>>
>> I tried manually going through the old partitions and setting that column
>> back to int, but I'm still getting the same exceptions.
>> I expected that promoting an int to a bigint wouldn't cause any problems.
>>
>> Am I doing something wrong?
>>
>> Thank you,
>> Daniel
>>
>
>
