Hi Abe,

The column in the WHERE clause is also a string. I tried creating the table
without any filters, just a plain CTAS, and that failed too.
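
One thing I can still try is dumping the schema of the underlying ORC files,
since the IntWritable-to-Text cast error looks like a mismatch between the
schema recorded in some files and the table schema (e.g. files written when
the column was an int). A sketch, assuming the table's warehouse location
(the path and file name below are placeholders):

  hive --service orcfiledump /user/hive/warehouse/master_tab/<some-file>

If any file reports that column as int rather than string, that would explain
the cast failure on read.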

Thanks,
Vivek

On Thu, May 21, 2015 at 7:47 PM, Abe Weinograd <a...@flonet.com> wrote:

> I think your WHERE clause may be running into a casting issue. Is that
> column of the right data type? Can you run the SELECT by itself, without
> the CREATE TABLE?
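> For example, something along these lines (table and column names taken
> from your query; the LIMIT is just to keep the test cheap):
>
>   DESCRIBE master_tab;
>   SELECT col_id_60 FROM master_tab WHERE col_id_60 <> '2015-05-20' LIMIT 10;
>
> If the bare SELECT fails the same way, the problem is in reading the ORC
> data rather than in the CTAS itself.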
>
> Abe
>
> On Wed, May 20, 2015 at 11:23 PM, vivek veeramani <
> vivek.veeraman...@gmail.com> wrote:
>
>>
>> Hi,
>>
>> I'm trying to create a subset table from a master table that is stored
>> in ORC format. The query I'm using is:
>>
>>  CREATE TABLE tab_20150510 STORED AS ORC AS SELECT * FROM master_tab
>> WHERE col_id_60 <> '2015-05-20';
>>
>> The master table has 60 columns, all of string type except one bigint
>> column, which is not part of the query. I also tried running this with
>> Tez, but no go.
>> The error I get is shown below:
>> The error I get is shown below:
>>
>> -----
>> Diagnostic Messages for this Task:
>> Error: java.io.IOException: java.io.IOException:
>> java.lang.ClassCastException: org.apache.hadoop.io.IntWritable cannot be
>> cast to org.apache.hadoop.io.Text
>>         at
>> org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
>>         at
>> org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
>>         at
>> org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:256)
>>         at
>> org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.next(HadoopShimsSecure.java:171)
>>         at
>> org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:198)
>>         at
>> org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:184)
>>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:52)
>>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:415)
>>         at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
>>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>> Caused by: java.io.IOException: java.lang.ClassCastException:
>> org.apache.hadoop.io.IntWritable cannot be cast to org.apache.hadoop.io.Text
>>         at
>> org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
>>         at
>> org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
>>         at
>> org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:344)
>>         at
>> org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:101)
>>         at
>> org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:41)
>>         at
>> org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.next(HiveContextAwareRecordReader.java:122)
>>         at
>> org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:254)
>>         ... 11 more
>> Caused by: java.lang.ClassCastException: org.apache.hadoop.io.IntWritable
>> cannot be cast to org.apache.hadoop.io.Text
>>         at
>> org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$StringDictionaryTreeReader.next(RecordReaderImpl.java:1592)
>>         at
>> org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$StringTreeReader.next(RecordReaderImpl.java:1346)
>>         at
>> org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$StructTreeReader.next(RecordReaderImpl.java:1788)
>>         at
>> org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.next(RecordReaderImpl.java:2997)
>>         at
>> org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:153)
>>         at
>> org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:127)
>>         at
>> org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:339)
>>         ... 15 more
>>
>> Container killed by the ApplicationMaster.
>> Container killed on request. Exit code is 143
>> Container exited with a non-zero exit code 143
>>
>>
>> FAILED: Execution Error, return code 2 from
>> org.apache.hadoop.hive.ql.exec.mr.MapRedTask
>> MapReduce Jobs Launched:
>> Job 0: Map: 1544   Cumulative CPU: 250716.7 sec   HDFS Read: 160494367819
>> HDFS Write: 121322793549 FAIL
>> Total MapReduce CPU Time Spent: 2 days 21 hours 38 minutes 36 seconds 700
>> msec
>>
>>
>> I would greatly appreciate any help or information on how to rectify this.
>>
>>
>> --
>> Thanks ,
>> Vivek Veeramani
>>
>>
>> cell : +91-9632 975 975
>>         +91-9895 277 101
>>
>
>


-- 
Thanks ,
Vivek Veeramani


cell : +91-9632 975 975
        +91-9895 277 101
