Hi all,
I have 3 tables in MySQL and I want to combine the data of the 3 tables into 1
table in Hive (i.e. create a data warehouse). I created a table with all the
columns of the 3 tables, but I am unable to push data into the Hive table.
After running a Sqoop import statement I pulled all the records into HDFS,
but at a
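One common way to do the combine step, once each Sqoop import has landed in
HDFS, is to put an external staging table over each import directory and then
populate the combined table with a join. A minimal HiveQL sketch follows; the
table names, columns, and paths are invented for illustration, with
warehouse_combined standing in for the table created above:

-- External staging tables over the Sqoop target directories
-- (paths, columns, and the comma delimiter are assumptions;
-- comma is Sqoop's default text delimiter)
CREATE EXTERNAL TABLE customers_stg (id INT, name STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/abhishek/sqoop/customers';

CREATE EXTERNAL TABLE orders_stg (id INT, customer_id INT, amount DOUBLE)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/abhishek/sqoop/orders';

CREATE EXTERNAL TABLE payments_stg (id INT, order_id INT, paid DOUBLE)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/abhishek/sqoop/payments';

-- Populate the single warehouse table from the three staging tables
INSERT OVERWRITE TABLE warehouse_combined
SELECT c.id, c.name, o.amount, p.paid
FROM customers_stg c
JOIN orders_stg o ON o.customer_id = c.id
JOIN payments_stg p ON p.order_id = o.id;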
Hi Bejoy,
Thanks for the reply. Got it.
Regards
Abhishek
On Tue, Jul 31, 2012 at 12:13 PM, Bejoy KS wrote:
Hi Abhishek,
To get the cause of this error, you need to look at the failed MapReduce task
logs.
Regards
Bejoy KS
Sent from handheld, please excuse typos.
-----Original Message-----
From: abhiTowson cal
Date: Tue, 31 Jul 2012 12:08:33
To:
Reply-To: user@hive.apache.org
Subject: ERROR
We do a similar process with our log files in Hive. We only handle 30 to 60
files (similar structure) at a time, but it sounds like it would fit your
model.
We create an external table, then do HDFS puts to add the files to the table:
CREATE EXTERNAL TABLE log_import(
date STRING,
time STRING)
-- remaining columns were cut off in the original message; the clauses below are assumed for illustration
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/user/logs/import';
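With the external table in place, adding data is just a matter of landing
files in its directory. The same effect as a raw hdfs put can be had from
within Hive via LOAD DATA; the path below is invented for illustration:

-- Move an already-uploaded HDFS file into the table's directory
LOAD DATA INPATH '/user/logs/incoming/2012-07-31.log' INTO TABLE log_import;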
If you create a table with a schema matching your files' structure, and later
add the files as partitions into the table -
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-AlterTable%2FPartitionStatements
- then you can query your files using a WHERE clause on the partition column.
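A minimal sketch of that workflow, with the table name, columns, and paths
invented for illustration:

CREATE EXTERNAL TABLE logs (
time STRING,
message STRING)
PARTITIONED BY (dt STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';

-- Register an existing HDFS directory of files as one partition
ALTER TABLE logs ADD PARTITION (dt='2012-07-31')
LOCATION '/user/logs/2012-07-31';

-- The WHERE clause on the partition column reads only that directory
SELECT time, message FROM logs WHERE dt='2012-07-31';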