I'm trying to create a series of external tables for a time series of data 
(using the prebuilt Cloudera VM).

The directory structure in HDFS is as such:

/711
/712
/713
/714
/715
/716
/717

Each directory contains the same set of files, from a different day. They were 
all put into HDFS using the following script:

for i in *; do hdfs dfs -put $i in $dir; done
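(As an aside, the stray word "in" in that loop may matter: `hdfs dfs -put` treats every argument before the last as a source path, so "in" is handed to it as an extra local source. A minimal local sketch, no cluster needed and with a hypothetical file name, that just prints what the loop expands to:)

```shell
# Print the exact command the loop would run, without touching HDFS.
dir=/715                     # hypothetical target directory
work=$(mktemp -d) && cd "$work"
touch file.csv               # stand-in for one day's data file
for i in *; do
  # note the stray "in" is passed to -put as a second source path
  echo hdfs dfs -put "$i" in "$dir"
done                         # prints: hdfs dfs -put file.csv in /715
```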

They all show up with the same ownership/perms in HDFS.

Going into Hive to build the tables, I wrote a set of load scripts, then ran 
sed (changing 711 to 712, 713, etc.) to produce a file for each day. All of 
my loads work, EXCEPT for 715 and 716.

Script is as follows:

create external table 715_table_name
(col1 string,
col2 string)
row format
delimited fields terminated by ','
lines terminated by '\n'
stored as textfile
location '/715/file.csv';

This is failing with:

Error in Metadata MetaException(message:Got except: 
org.apache.hadoop.fs.FileAlreadyExistsException Parent Path is not a directory: 
/715 715...
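(For context, my understanding is that this exception usually means the parent path exists in HDFS as a plain file rather than a directory; `hdfs dfs -ls -d /715` should show which it is. A local filesystem analogy, with hypothetical paths:)

```shell
# Local analogy of the HDFS error: creating anything "inside" a path
# that exists as a plain file fails with "Not a directory".
work=$(mktemp -d)
touch "$work/715"                    # 715 exists as a file, not a dir
mkdir "$work/715/sub" 2>/dev/null \
  && echo "created" || echo "Not a directory"
```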

Like I mentioned, it works for all of the other directories except 715 and 716. 
Any thoughts on a troubleshooting path?

Thanks

Joey D'Antoni
