Hi Omar
     You'd definitely need to copy the file into HDFS from your remote server.
Even if you use LOAD DATA LOCAL ..., the data is copied into HDFS. Hive
triggers MapReduce jobs for HiveQL queries, and for that the data needs to be
in HDFS. The better approach would be:
- copy the data into HDFS
- use it with Hive
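
The two steps above can be sketched from the shell. Since the 30 GB file sits on 10.11.12.13, one option is streaming it over SSH straight into HDFS, which avoids staging a full local copy on the Hadoop node first. This is only a sketch: the `user` account and the HDFS staging directory `/user/hive/staging` are assumptions; the file path and table name are taken from the thread.

```shell
# Sketch, assuming passwordless SSH to 10.11.12.13 as a hypothetical
# account "user"; the HDFS target directory is also an assumption.

# Stream the remote file straight into HDFS ("-put -" reads from stdin),
# so no full local copy is kept on the Hadoop node:
ssh user@10.11.12.13 'cat /home/nzdata/CLOUD/SCRIPT/LU_CUSTOMER.txt' \
  | hadoop fs -put - /user/hive/staging/LU_CUSTOMER.txt

# Then load from HDFS into the Hive table (note: no LOCAL keyword,
# since the data is already in HDFS; this moves the file into the
# table's warehouse directory):
hive -e "LOAD DATA INPATH '/user/hive/staging/LU_CUSTOMER.txt' OVERWRITE INTO TABLE LU_CUSTOMER;"
```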

Regards
Bejoy.KS


________________________________
 From: "Omer, Farah" <fo...@microstrategy.com>
To: "hive-u...@hadoop.apache.org" <hive-u...@hadoop.apache.org> 
Sent: Thursday, March 1, 2012 9:50 PM
Subject: How to load a table from external server....
 

 
Hello,
 
Could anybody tell me how I can load data into a Hive table when the flat file 
exists on another server and not locally on the Hadoop node?
 
For example, I am trying to load the table LU_CUSTOMER, and the flat file for 
this table exists on another RH Linux server: 10.11.12.13. The LU_CUSTOMER 
flat file is about 30 GB in size, so moving it locally to the Hadoop node 
would take a long time; I am trying to avoid that step.
So I wonder if there is a way to load the table directly from the other server.
 
The syntax that I know currently is: LOAD DATA LOCAL INPATH 
'/home/nzdata/CLOUD/SCRIPT/LU_CUSTOMER.txt' OVERWRITE INTO TABLE LU_CUSTOMER;
 
But if I want to load from the other server directly, the path won't be 
local. 
 
Any suggestions? Is that possible….
 
Thanks.
 
Farah Omer
 
Senior DB Engineer, MicroStrategy, Inc. 
T:703 2702230
E:fo...@microstrategy.com
http://www.microstrategy.com
