I am not familiar with https://github.com/mayanhui/hive-orc-mr/, but is
there any reason why you are not using the HCatalog input/output format
for this? See
https://cwiki.apache.org/confluence/display/Hive/HCatalog+InputOutput
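
If it helps, here is a rough sketch of what the write side looks like with
HCatOutputFormat (Hadoop 2 / Hive 0.13-era APIs; the table name
"default.orc_table" and the two-column layout are just placeholders, and in
older releases the classes live under org.apache.hcatalog rather than
org.apache.hive.hcatalog). The idea is that your mapper emits HCatRecords
and HCatalog picks up the ORC writer from the table definition, so the
target table has to exist already and be declared STORED AS ORC:

  import java.io.IOException;
  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.Path;
  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.NullWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Job;
  import org.apache.hadoop.mapreduce.Mapper;
  import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
  import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
  import org.apache.hive.hcatalog.data.DefaultHCatRecord;
  import org.apache.hive.hcatalog.data.schema.HCatSchema;
  import org.apache.hive.hcatalog.mapreduce.HCatOutputFormat;
  import org.apache.hive.hcatalog.mapreduce.OutputJobInfo;

  public class OrcWriteJob {

    // Map-only job: parse tab-separated text and emit one HCatRecord per line.
    public static class OrcWriteMapper
        extends Mapper<LongWritable, Text, NullWritable, DefaultHCatRecord> {
      @Override
      protected void map(LongWritable key, Text value, Context context)
          throws IOException, InterruptedException {
        String[] parts = value.toString().split("\t");
        // Placeholder schema: (id INT, name STRING).
        DefaultHCatRecord record = new DefaultHCatRecord(2);
        record.set(0, Integer.parseInt(parts[0]));
        record.set(1, parts[1]);
        context.write(NullWritable.get(), record);
      }
    }

    public static void main(String[] args) throws Exception {
      Configuration conf = new Configuration();
      Job job = Job.getInstance(conf, "orc-write-via-hcatalog");
      job.setJarByClass(OrcWriteJob.class);
      job.setMapperClass(OrcWriteMapper.class);
      job.setNumReduceTasks(0);

      job.setInputFormatClass(TextInputFormat.class);
      FileInputFormat.addInputPath(job, new Path(args[0]));

      // The table must already exist and be STORED AS ORC; HCatalog then
      // routes the records through the ORC serde/writer for you.
      HCatOutputFormat.setOutput(job,
          OutputJobInfo.create("default", "orc_table", null));
      // Note: some older releases take the Job/JobContext here instead of
      // the Configuration.
      HCatSchema schema =
          HCatOutputFormat.getTableSchema(job.getConfiguration());
      HCatOutputFormat.setSchema(job, schema);
      job.setOutputFormatClass(HCatOutputFormat.class);

      job.setOutputKeyClass(NullWritable.class);
      job.setOutputValueClass(DefaultHCatRecord.class);

      System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
  }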
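
On the 0.13 VerifyError itself: a java.lang.VerifyError on OrcProto$RowIndex
is usually a protobuf version clash rather than an ORC bug. Hive 0.13's ORC
classes are generated against protobuf 2.5.0, and if an older protobuf jar
from the Hadoop distribution is found first on the task classpath,
verification fails. A rough workaround (assuming a Hadoop 2.x cluster that
supports the mapreduce.job.user.classpath.first property) is to ship the
matching protobuf with the job and let the user classpath take precedence:

  // In the job driver: prefer jars shipped with the job over the cluster's
  // copies, then submit with
  //   -libjars protobuf-java-2.5.0.jar,hive-exec-0.13.0.jar
  Configuration conf = new Configuration();
  conf.setBoolean("mapreduce.job.user.classpath.first", true);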

On Wed, Apr 30, 2014 at 4:25 AM, Seema Datar <sda...@yahoo-inc.com> wrote:
> Hi All,
>
> Does anybody have ideas to solve this issue?
>
> Thanks,
> Seema
>
> From: Seema Datar <sda...@yahoo-inc.com>
> Date: Tuesday, April 29, 2014 at 11:10 PM
>
> To: "user@hive.apache.org" <user@hive.apache.org>
> Subject: Re: OrcOutputFormat
>
> Hi Abhishek,
>
> I was referring to the link below and was trying to do something similar.
>
> https://github.com/mayanhui/hive-orc-mr/
>
> This package does not seem to use HCatalog.
>
> Thanks,
> Seema
>
>
> From: Abhishek Girish <agir...@ncsu.edu>
> Reply-To: "user@hive.apache.org" <user@hive.apache.org>
> Date: Tuesday, April 29, 2014 at 10:38 PM
> To: "user@hive.apache.org" <user@hive.apache.org>
> Subject: Re: OrcOutputFormat
>
> Hi,
>
> AFAIK, you would need to use the HCatalog APIs to read from or write to an
> ORC file. Please refer to
> https://cwiki.apache.org/confluence/display/Hive/HCatalog+InputOutput
>
> -Abhishek
>
>
>
> On Tue, Apr 29, 2014 at 6:40 AM, Seema Datar <sda...@yahoo-inc.com> wrote:
>>
>> Hi,
>>
>> I am trying to run an MR job to write files in ORC format. I do not see
>> any files created although the job runs successfully. If I change the output
>> format from OrcOutputFormat to TextOutputFormat (with that being the only
>> change), the output files do get created. I am using Hive 0.12.0. I tried
>> upgrading to Hive 0.13.0, but with that version I get the following error:
>>
>> 2014-04-29 10:37:07,426 FATAL [main] org.apache.hadoop.mapred.YarnChild:
>> Error running child : java.lang.VerifyError:
>> org/apache/hadoop/hive/ql/io/orc/OrcProto$RowIndex
>>      at
>> org.apache.hadoop.hive.ql.io.orc.WriterImpl.<init>(WriterImpl.java:129)
>>      at
>> org.apache.hadoop.hive.ql.io.orc.OrcFile.createWriter(OrcFile.java:369)
>>      at
>> org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.close(OrcOutputFormat.java:104)
>>      at
>> org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.close(OrcOutputFormat.java:91)
>>      at
>> org.apache.hadoop.mapred.MapTask$DirectMapOutputCollector.close(MapTask.java:784)
>>      at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:411)
>>      at org.apache.hadoop.mapred.MapTask.run(MapTask.java:335)
>>      at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:158)
>>      at java.security.AccessController.doPrivileged(Native Method)
>>      at javax.security.auth.Subject.doAs(Subject.java:415)
>>      at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1300)
>>      at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:153)
>>
>> How do you think this issue can be resolved?
>>
>>
>> Thanks,
>>
>> Seema
>
>
