Thanks for your reply. I will try it and get back to you.
Thanks,
Venkat
On Friday 25 January 2013 06:37 PM, bejoy...@yahoo.com wrote:
Hi Venkataraman
You can just create an external table and set its LOCATION to the HDFS
directory where the data resides.
No need to perform an explicit LOAD operation here.
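As a minimal sketch, reusing the columns from your DDL and assuming the
data stays under the '/twitter_sample' path from your LOAD statement
(adjust the path and column types as needed):

CREATE EXTERNAL TABLE Tweets (
  FromUserId STRING, Text STRING, FromUserIdString STRING,
  FromUser STRING, Geo STRING, Id BIGINT, IsoLangCode STRING,
  ToUserId INT, ToUserIdString STRING, CreatedAt STRING)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\n'
-- point the table at the existing HDFS directory instead of LOADing
LOCATION '/twitter_sample';

Hive then reads the files in place, so nothing is moved out of the
directory, and new files written there by your streaming job are picked
up on the next query.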
Regards
Bejoy KS
Sent from remote device, Please excuse typos
------------------------------------------------------------------------
*From:* venkatramanan <venkatraman...@smartek21.com>
*Date:* Fri, 25 Jan 2013 18:30:29 +0530
*To:* <user@hive.apache.org>
*Reply-To:* user@hive.apache.org
*Subject:* LOAD HDFS into Hive
Hi,
I need to load HDFS data into a Hive table.
For example,
I have Twitter data that is updated daily via the streaming API. These
Twitter responses are stored under an HDFS path (e.g. 'TwitterData').
I then try to load that data into Hive using the LOAD DATA statement.
My problem is that the data is no longer at its original HDFS location
once I load it. Is there any way to load the data into Hive without
losing the HDFS copy?
To create the table I use the statement below:
CREATE EXTERNAL TABLE Tweets (
  FromUserId STRING, Text STRING, FromUserIdString STRING,
  FromUser STRING, Geo STRING, Id BIGINT, IsoLangCode STRING,
  ToUserId INT, ToUserIdString STRING, CreatedAt STRING)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\n';
To load the data I use the statement below:
LOAD DATA INPATH '/twitter_sample' INTO TABLE tweets;
thanks in advance
Thanks,
Venkat