Manish, specify the directory as the LOCATION, not the actual file; notice there is no .csv in the LOCATION below. If your field definitions match the data already present in HDFS, this should work just fine. Make sure all the files under the input directory have the same structure.
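Before creating the table, it is also worth confirming the copy actually succeeded. A minimal sketch of the copy plus a sanity check, assuming fs.default.name is set to hdfs://localhost:54310 so bare paths resolve to the same place as the full URI in your original command:

  hadoop fs -mkdir /usr/manish/input
  hadoop fs -copyFromLocal /home/manish/airteldata/datadump.txt /usr/manish/input/data.csv
  hadoop fs -ls /usr/manish/input    # the file should show up here

If -ls lists the file, a table whose LOCATION is the directory will pick it up: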
CREATE EXTERNAL TABLE tbl_call_lead(
  ...
  ....
  ...
  ...
  ..)
COMMENT 'data lead table'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054'
STORED AS TEXTFILE
LOCATION '/usr/manish/input';

Good luck

On Thu, Aug 16, 2012 at 7:52 AM, Manish <manishbh...@rocketmail.com> wrote:
>
> Hello All,
>
> I am copying data from my local drive to HDFS and creating an external
> table in Hive, but for some reason the data is not copied and the
> CREATE TABLE script gives an error: "file/folder doesn't exist".
>
> Note: there is also a "log4j" error; I don't know the exact reason for
> it. The same error appeared in the past, but the data was still copied.
>
> manish@manish:~$ hadoop fs \
>     -copyFromLocal /home/manish/airteldata/datadump.txt \
>     hdfs://localhost:54310/usr/manish/input/data.csv
> log4j:ERROR Could not find value for key log4j.appender.NullAppender
> log4j:ERROR Could not instantiate appender named "NullAppender".
>
> CREATE external TABLE tbl_call_lead(
> ...
> ....
> ...
> ...
> ..)
> COMMENT 'data lead table'
> ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054'
> STORED AS TEXTFILE
> LOCATION '/usr/manish/input/data.csv';