Farah – can you configure the remote server as a client machine? You would just need to install Hadoop with a configuration pointing at your cluster, and then install Hive. You'd then be able to execute all Hive commands against your cluster. Note that you won't run any daemons on this node, so make sure none of the Hadoop processes are started on it.
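A rough sketch of what that client setup could look like (the hostnames, ports, and property names below are placeholders assuming a Hadoop 0.20/1.x-style configuration, not values from your environment):

  <!-- core-site.xml on the client node: point at the cluster's namenode -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://your-namenode-host:8020</value>
  </property>

  <!-- mapred-site.xml on the client node: point at the jobtracker -->
  <property>
    <name>mapred.job.tracker</name>
    <value>your-jobtracker-host:8021</value>
  </property>

With Hive installed on that node, the LOAD DATA LOCAL INPATH statement from your example should then work against the 30 GB file where it already sits, and Hive will copy it into HDFS for you as part of the load:

  hive> LOAD DATA LOCAL INPATH '/home/nzdata/CLOUD/SCRIPT/LU_CUSTOMER.txt'
      > OVERWRITE INTO TABLE LU_CUSTOMER;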
Jonathan

On Thu, Mar 1, 2012 at 10:20 AM, Omer, Farah <fo...@microstrategy.com> wrote:

> Hello,
>
> Could anybody tell me how I can load data into a Hive table when the flat
> file exists on another server and not locally on a Hadoop node?
>
> For example, I am trying to load the table LU_CUSTOMER, and the flat file
> for this table exists on another RH Linux server: 10.11.12.13. The
> LU_CUSTOMER flat file is about 30 GB in size, so moving it to the Hadoop
> node would take a long time. I am trying to avoid that copy onto the
> Hadoop node.
> So I wonder if there is a way to load the table directly from the other
> server.
>
> The syntax that I know currently is: LOAD DATA LOCAL INPATH
> '/home/nzdata/CLOUD/SCRIPT/LU_CUSTOMER.txt' OVERWRITE INTO TABLE
> LU_CUSTOMER;
>
> But in case I want to load from the other server directly, the path won't
> be local.
>
> Any suggestions? Is that possible?
>
> Thanks.
>
> Farah Omer
>
> Senior DB Engineer, MicroStrategy, Inc.
> T: 703 2702230
> E: fo...@microstrategy.com
> http://www.microstrategy.com