Re: external partitioned table

2012-02-08 Thread Roberto Congiu
Hi Koert, we have a similar situation and this is what we did. In our case, the partitions correspond to dates. We also have multiple external tables set up this way. The upstream process updates a status file with the earliest and latest date available. I scan the DFS for new partitions (scan prog…
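
For each new date directory such a scan turns up, the registration step comes down to an ADD PARTITION statement. A minimal sketch, assuming a hypothetical external table page_views partitioned by dt with a matching HDFS layout (names and paths are illustrative, not from the thread):

  -- Register one newly discovered date directory as a partition.
  -- IF NOT EXISTS makes the statement idempotent, so re-scanning
  -- a directory that is already registered is harmless.
  ALTER TABLE page_views ADD IF NOT EXISTS
    PARTITION (dt='2012-02-08')
    LOCATION '/data/page_views/dt=2012-02-08';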

Re: external partitioned table

2012-02-08 Thread Mark Grover
Hi Koert, That's because the Hive metastore doesn't know about the partitions you added. I was in a similar situation, but I use Amazon EMR, and in their version of Hive one can run the command "alter table src recover partitions", which goes through the directory structure of the table (src, in this case)…
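
A short sketch of both forms, reusing the table name src from the message; stock Apache Hive exposes the same behaviour as MSCK REPAIR TABLE:

  -- Amazon EMR Hive: walk the table's directory tree and register
  -- any partitions the metastore does not yet know about
  ALTER TABLE src RECOVER PARTITIONS;

  -- Roughly equivalent command in stock Apache Hive
  MSCK REPAIR TABLE src;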

Re: external partitioned table

2012-02-08 Thread bejoy_ks
Hi Koert, As you are creating dirs/sub-dirs using MapReduce jobs outside of Hive, Hive is unaware of these sub-dirs. There is no other way in such cases than an ADD PARTITION DDL to register each dir with a Hive partition. If you are using Oozie or a shell script to trigger your jobs, you can accom…
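
To illustrate the kind of step such an Oozie action or shell wrapper could run after the MapReduce job finishes, here is a minimal parameterized sketch; the script name, table name, and partition layout are assumptions for illustration only:

  -- add_partition.hql (hypothetical script), invoked for example as:
  --   hive --hivevar dt=2012-02-08 -f add_partition.hql
  ALTER TABLE page_views ADD IF NOT EXISTS
    PARTITION (dt='${hivevar:dt}')
    LOCATION '/data/page_views/dt=${hivevar:dt}';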