Farah – can you configure the remote server as a client machine? You would
just need to install Hadoop with a configuration pointing to your cluster,
and then install Hive. You'd then be able to execute all Hive commands
against your cluster. Note that you won't run any daemons on this node, so
you only need the client binaries and configuration there (what is often called an edge or gateway node).
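
A minimal sketch of what that client-only setup might look like; the hostnames, ports, and paths below are placeholders for your cluster, not values from this thread:

```sh
# client-only ("edge") node sketch; "namenode-host" and "metastore-host"
# are placeholders -- substitute your cluster's hosts

# core-site.xml on the client: point at the cluster's NameNode
#   <property>
#     <name>fs.defaultFS</name>
#     <value>hdfs://namenode-host:8020</value>
#   </property>

# hive-site.xml on the client: point at the cluster's metastore
#   <property>
#     <name>hive.metastore.uris</name>
#     <value>thrift://metastore-host:9083</value>
#   </property>

# no daemons run here; the CLI just submits work to the cluster
hadoop fs -ls /
hive -e 'SHOW DATABASES;'
```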
Hi Omar
You'd definitely need to copy the file into HDFS from your remote server.
Even if you use LOAD DATA LOCAL ..., the data is still copied into HDFS. Hive
triggers MapReduce jobs for HiveQL, and for that the data needs to be in
HDFS. The better approach would be to:
- copy the data into HDFS
- use it from Hive, for example with LOAD DATA INPATH or an external table over that location (sketched below)
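
Something along these lines; the local path, HDFS directory, and table name are placeholders, and the table is assumed to already exist:

```sh
# 1. copy the file from the remote server into HDFS
hadoop fs -mkdir -p /user/omar/staging
hadoop fs -put /local/path/events.csv /user/omar/staging/

# 2. load it into an existing Hive table; LOAD DATA INPATH (no LOCAL)
#    moves the file within HDFS, so the data isn't copied a second time
hive -e "LOAD DATA INPATH '/user/omar/staging/events.csv' INTO TABLE events;"
```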
Hi Farah, if the data is on another server, you still need to move it one
way or another. A bare-bones way to do this is the `hadoop fs -put ...`
command, after which you can create an external or managed table in Hive. If
the data is in a relational DB, you can use Sqoop. You can also look into
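
In case a concrete example helps, here is a rough sketch of both routes; the paths, table schema, and JDBC connection string are made up for illustration:

```sh
# option 1: put the files in HDFS, then define an external table over them
hadoop fs -put /local/path/events.csv /data/events/
hive -e "CREATE EXTERNAL TABLE events (id INT, payload STRING)
         ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
         LOCATION '/data/events/';"

# option 2: if the source is a relational DB, pull it straight into Hive with Sqoop
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username etl -P \
  --table events \
  --hive-import --hive-table events
```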