I have a set of Hive tables. Quite a large number of them are not
external tables but internal (managed) Hive tables. I have another Hadoop
instance up and running. Is there a way I can migrate the Hive data from
one instance to another, and then create the Hive tables on the new
instance? Can t
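One common approach (a sketch, not a definitive recipe: the cluster hostnames, ports, and warehouse path below are assumptions you'd adjust to your setup) is to copy the warehouse directory with distcp and then recreate the DDL on the new instance:

```
# Copy the managed tables' data between clusters.
# /user/hive/warehouse is the default hive.metastore.warehouse.dir;
# adjust both paths to match your configuration.
hadoop distcp \
  hdfs://old-namenode:8020/user/hive/warehouse \
  hdfs://new-namenode:8020/user/hive/warehouse
```

Then, on the new instance, re-run each table's CREATE TABLE statement (DESCRIBE FORMATTED on the old instance shows the storage details); a managed table created at the default location should pick up the copied data already sitting there, though partitioned tables may need their partitions re-added. Newer Hive releases (0.8+) also have EXPORT TABLE / IMPORT for this, but that won't be available on Hive 0.6.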
Doesn't work for me:
>>
>> *describe formatted table1*
>>
>> I am running Hive 0.6. Anything else I need to do before running this
>> query?
>>
>> On Thu, Aug 2, 2012 at 12:13 PM, Igor Tatarinov wrote:
>>
>>> Try
ah, works. thanks.
On Thu, Aug 2, 2012 at 3:13 PM, Igor Tatarinov wrote:
> Try
> describe formatted
>
> igor
> decide.com
>
> On Thu, Aug 2, 2012 at 12:04 PM, Anson Abraham wrote:
>
>> is there a way to derive information (schema) off a hive table?
>>
is there a way to derive information (schema) off a hive table?
doing a describe only shows the columns w/ types.
But I want to know if a table is defined like this:
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
COLLECTION ITEMS TERMINATED BY '\002'
MAP KEYS TERMINATED BY '\003'
ST
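For the record, the storage details (SerDe, delimiters, location) do show up in DESCRIBE FORMATTED, and later Hive releases (0.10+, if I recall correctly) add SHOW CREATE TABLE, which prints a complete, re-runnable DDL statement. A sketch (the table name is just an example):

```sql
-- Columns plus storage info: SerDe, input/output format,
-- field/collection/map delimiters, location, table parameters.
DESCRIBE FORMATTED table1;

-- Hive 0.10 and later: prints the full CREATE TABLE statement,
-- including the ROW FORMAT DELIMITED ... clauses.
SHOW CREATE TABLE table1;
```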
With the release of CDH4, is LZO compression still supported, if I have my
Hive table point to a path of files in LZO?
-anson
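For what it's worth, LZO should still work on CDH4 provided the hadoop-lzo libraries are installed (in CDH4 they ship separately, via the GPL extras repository, I believe) and the codec is registered in io.compression.codecs. A hypothetical table over LZO-compressed text files (column names and path are made up):

```sql
CREATE EXTERNAL TABLE logs_lzo (
  ts  STRING,
  msg STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS
  -- The hadoop-lzo input format; splits .lzo files if they
  -- have been indexed with the LZO indexer.
  INPUTFORMAT  'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
  OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION '/data/logs';
```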
In Hive, I'm having issues doing an INSERT OVERWRITE into a table that
is in Avro format.
So my existing table (table1) is read from a hdfs directory where the files
are in avro format.
I created another table table2 in avro format (which is identical in
columns, etc...):
CREATE EXTERNAL TAB
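For reference, an Avro-backed table is usually declared with the AvroSerDe and the Avro container input/output formats; a sketch of what table2 might look like (the location, schema URL, and the SELECT are assumptions, not the poster's actual DDL):

```sql
CREATE EXTERNAL TABLE table2
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS
  INPUTFORMAT  'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
  OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION '/data/table2'
-- Columns are derived from the Avro schema rather than listed inline.
TBLPROPERTIES ('avro.schema.url'='hdfs:///schemas/table2.avsc');

-- Then the rewrite:
INSERT OVERWRITE TABLE table2 SELECT * FROM table1;
```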
> ...our new field will be NULL. Is that not what you observe?
>
> Thanks,
> Aniket
>
>
> On Thu, Mar 1, 2012 at 12:06 PM, Anson Abraham wrote:
>
>> If i have a hive table, which is an external table, and have my "log
>> files" being read into it, if a new file
If I have a Hive table, which is an external table, with my "log files"
being read into it, and a new file imported into HDFS has a new column,
how can I get Hive to handle the old files w/o the new column if I do
an ALTER TABLE adding the column to the Hive table?
So example, i hav
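A sketch of the sequence as I understand it (table and column names are hypothetical): with a delimited external table, adding the column at the end means rows from old files that lack the trailing field simply come back as NULL for it, while new files populate it normally.

```sql
-- Existing external table over the log directory; append the
-- new column at the end so old files still parse.
ALTER TABLE logs ADD COLUMNS (new_col STRING);

-- Rows from old files should show NULL in new_col;
-- rows from new files carry the real value.
SELECT new_col, COUNT(*) FROM logs GROUP BY new_col;
```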