...e my_table
position=22
and it's waiting there at SemanticAnalyzer...
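You can see where it sits with a thread dump along these lines (the pid is whatever jps reports for the Hive CLI; the grep width is arbitrary):

  jps -lm | grep CliDriver                    # find the Hive CLI process id
  jstack <pid> | grep -A 10 SemanticAnalyzer  # stack frames show where it is parked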
Sincerely,
Ameet
From: Bejoy Ks
To: "user@hive.apache.org"; Edward Capriolo; ameet chaubal
Sent: Monday, January 16, 2012 10:24 AM
Subject: Re: large sql file creating large num of columns
From: ameet chaubal
To: Edward Capriolo; "user@hive.apache.org"
Sent: Monday, January 16, 2012 8:44 PM
Subject: Re: large sql file creating large num of columns
thanks,
this is an external table, so at the DDL stage there is no data loading
happening. All that Hive is supposed to do is create the metadata entry in
the metastore; an "external table" does not need the data to be
present, right?
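Something along these lines, just with ~800,000 columns; the table name, column names, and location below are made up for illustration, but the point stands that the statement only writes metastore entries and points at an existing HDFS directory:

  CREATE EXTERNAL TABLE my_wide_table (
    col_000001 STRING,
    col_000002 STRING,
    col_000003 STRING
    -- ... roughly 800,000 column definitions in total
  )
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  LOCATION '/user/ameet/my_wide_table';  -- no data is read or moved at CREATE time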
Sincerely,
Ameet
From: Edward Capriolo
To: user@hive.apache.org; ameet chaubal
Sent: Monday, January 16, 2012 10:06 AM
Subject: Re: large sql file creating large num of columns
I highly doubt this will work. I think that many things in Hadoop and Hive
will try to buffer an entire row, so even if you make it past the metastore
I do not think it will be of any use.
On Mon, Jan 16, 2012 at 9:42 AM, ameet chaubal wrote:
> Hi All,
>
> I have a SQL file of size 30mb which is a single create table statement with
> about 800,000 columns, hence the size.
> I am trying to execute it using hive -f. Initially, hive ran the command
> with a 256mb heap size and gave me an OOM error. I increased the heap size
> using export HADOOP_HEAPSIZE...
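A sketch of that invocation, with a made-up heap size and file name; HADOOP_HEAPSIZE is the client-side heap in megabytes read by the hive launcher script:

  export HADOOP_HEAPSIZE=2048    # MB for the CLI JVM; 2048 is a hypothetical value
  hive -f create_wide_table.sql  # the 30mb DDL file; file name is hypothetical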