This is solved: I used Writable instead of LongWritable or NullWritable as the
Mapper input key type.
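For reference, a minimal sketch of the idea (class and column names here are placeholders, not the actual job code): typing the input key as the generic Writable interface lets one Mapper accept whichever concrete key class the storage format emits (e.g. LongWritable for TEXTFILE splits, NullWritable for ORC splits). It assumes the HCatalog jars (org.apache.hive.hcatalog.*) are on the classpath.

```java
// Hypothetical sketch: the Mapper input key is declared as the Writable
// interface so the same class works for TEXTFILE- and ORC-backed
// HCatalog tables. Assumes Hive 0.13+ package layout on the classpath.
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hive.hcatalog.data.HCatRecord;

public class MultiTableMapper
        extends Mapper<Writable, HCatRecord, Text, Text> {

    @Override
    protected void map(Writable key, HCatRecord value, Context context)
            throws java.io.IOException, InterruptedException {
        // Read fields by position; column 0 is assumed to be printable here.
        context.write(new Text(String.valueOf(value.get(0))), new Text(""));
    }
}
```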
Thanks
Suraj Nayak
On 19-Mar-2015 9:48 PM, "Suraj Nayak" wrote:
Is this related to https://issues.apache.org/jira/browse/HIVE-4329 ? Is
there a workaround?
On Thu, Mar 19, 2015 at 9:47 PM, Suraj Nayak wrote:
Hi All,
I was successfully able to integrate HCatMultipleInputs with the patch for
the tables created with TEXTFILE. But I get an error when I read a table
created with an ORC file. The error is below:
15/03/19 10:51:32 INFO mapreduce.Job: Task Id :
attempt_1425012118520_9756_m_00_0, Status : FAILED
Hi All,
The patch from https://issues.apache.org/jira/browse/HIVE-4997 helped!
On Tue, Mar 17, 2015 at 1:05 AM, Suraj Nayak wrote:
Hi,
I tried reading data via HCatalog for 1 Hive table in MapReduce using
something similar to
https://cwiki.apache.org/confluence/display/Hive/HCatalog+InputOutput#HCatalogInputOutput-RunningMapReducewithHCatalog.
I was able to read successfully.
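For context, the single-table read followed the pattern from that wiki page, roughly like the sketch below. The database/table names ("default", "my_table") are placeholders, and it assumes the org.apache.hive.hcatalog package layout (Hive 0.13+) with the HCatalog jars on the classpath.

```java
// Hedged sketch of a single-table HCatalog read, following the
// HCatalog InputOutput wiki example. Not the actual job code.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hive.hcatalog.data.HCatRecord;
import org.apache.hive.hcatalog.mapreduce.HCatInputFormat;

public class SingleTableRead {

    // Emits the first column of each record; column 0 is assumed printable.
    public static class ReadMapper
            extends Mapper<WritableComparable, HCatRecord, Text, Text> {
        @Override
        protected void map(WritableComparable key, HCatRecord value,
                           Context context)
                throws java.io.IOException, InterruptedException {
            context.write(new Text(String.valueOf(value.get(0))), new Text(""));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "hcat-single-table-read");
        job.setJarByClass(SingleTableRead.class);

        // Point the job at one Hive table via HCatalog.
        HCatInputFormat.setInput(job, "default", "my_table");
        job.setInputFormatClass(HCatInputFormat.class);

        job.setMapperClass(ReadMapper.class);
        job.setNumReduceTasks(0);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        TextOutputFormat.setOutputPath(job, new Path(args[0]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```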
Now I am trying to read 2 tables, as the requiremen