aha! All's well that ends well then! :)
On Thu, Jun 6, 2013 at 9:49 AM, Sachin Sudarshana wrote:
Hi Stephen,
Thank you for your reply.
But it's the silliest error on my side. It's a typo!
The codec is org.apache.hadoop.io.compress.*GzipCodec* and not
org.apache.hadoop.io.compress.*GZipCodec*.
I regret making that mistake.
Thank you,
Sachin
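Put together, the corrected settings from earlier in the thread would read along these lines (only the codec spelling changes):

hive> SET hive.exec.compress.output=true;
hive> SET mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec;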
On Thu, Jun 6, 2013 at 10:07 PM, Stephen wrote:
Hi Sachin,
Like you say, it looks like something to do with the GZipCodec all right. And
that would make sense given your original problem.
Yeah, one would think it'd be in there by default, but for whatever reason
it's not finding it. At least the problem is now identified.
Now _my guess_ is that m
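If the codec ever genuinely isn't registered, one quick check (an extra suggestion, not something raised in the thread) is to print Hadoop's codec list from the Hive CLI:

-- prints the comma-separated list of codec classes registered with Hadoop;
-- org.apache.hadoop.io.compress.GzipCodec should appear in it
hive> SET io.compression.codecs;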
Hi Stephen,
hive> show create table facts520_normal_text;
OK
CREATE TABLE facts520_normal_text(
  fact_key bigint,
  products_key int,
  retailers_key int,
  suppliers_key int,
  time_key int,
  units int)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY ','
  LINES TERMINATED B
Well... the HiveException has the word "metadata" in it. Maybe that's a
hint or a red herring. :) Let's try the following:
1. show create table facts520_normal_text;
2. anything useful at this URL?
http://aana1.ird.com:50030/taskdetails.jsp?jobid=job_201306051948_0010&tipid=task_
Hi,
I have Hive 0.10 (+ CDH 4.2.1 patches) installed on my cluster.
I have a table facts520_normal_text stored as a textfile. I'm trying to
create a compressed table from this table using GZip codec.
hive> SET hive.exec.compress.output=true;
hive> SET mapred.output.compression.codec=org.apach
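The codec value is cut off above; assuming the next step was a plain CREATE TABLE AS SELECT, the rest of the flow would look roughly like this (the facts520_gzip name and the default warehouse path are illustrative, not from the thread):

hive> CREATE TABLE facts520_gzip AS SELECT * FROM facts520_normal_text;
-- once compression works, the files under the new table's directory should end in .gz
hive> dfs -ls /user/hive/warehouse/facts520_gzip;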