Thanks for clarifying that, Bryan.
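
Subbu, on your original question about doing this with Sqoop: Sqoop can
import a day's data straight into a single Hive partition, but dropping the
week-old partition still has to happen on the Hive side (for example with a
small script, as in option 2 below). A rough, untested sketch, assuming
Sqoop 1.x and a table partitioned by a date string column named dt (the
connection string, table, and column names are only placeholders):

  sqoop import \
    --connect jdbc:mysql://dbhost/mydb \
    --username myuser -P \
    --table source_table \
    --hive-import \
    --hive-table my_table \
    --hive-partition-key dt \
    --hive-partition-value 2014-03-06

  # then drop the week-old partition from Hive
  hive -e "ALTER TABLE my_table DROP IF EXISTS PARTITION (dt='2014-02-27');"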

On Thu, Mar 6, 2014 at 7:55 PM, Bryan Jeffrey <bryan.jeff...@gmail.com> wrote:

> Nitin,
>
> #3 will not work.  MSCK REPAIR TABLE does not remove partitions if the
> files associated with the partition do not exist.  We have successfully
> applied #2 in our application.
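>
> For reference, a rough sketch of the #2 approach, assuming the table is
> partitioned by a date string column named dt (table, column, and path
> names here are illustrative):
>
>   -- load the new day's data into its own partition
>   LOAD DATA INPATH '/staging/2014-03-06'
>     INTO TABLE my_table PARTITION (dt='2014-03-06');
>
>   -- drop the week-old partition (for a managed table this also removes
>   -- its data)
>   ALTER TABLE my_table DROP IF EXISTS PARTITION (dt='2014-02-27');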
>
> Regards,
>
> Bryan Jeffrey
>
>
> On Thu, Mar 6, 2014 at 5:37 AM, Nitin Pawar <nitinpawar...@gmail.com> wrote:
>
>> There is no built-in way to do this automatically.
>>
>> Please wait for the expert Hive gurus to reply before using any of my
>> suggestions.
>>
>> A few options I can think of:
>> 1) Insert overwrite the table with dynamic partitions enabled, restricting
>> the partition column values to the date range you want (see the sketch
>> after this list). The cost of this operation depends entirely on how big
>> the table is when you are importing via Sqoop.
>>
>> 2) Load the data into a new partition and drop the older partition using a
>> Hive script (a little bit of scripting effort is needed).
>> 3) Use Hadoop command-line utilities to clear the partition directories
>> from HDFS and then do a table repair. I have never heard of anyone using
>> this to delete partitions; it is mostly used to recover lost ones.
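>>
>> A rough sketch of option 1, assuming the table is partitioned by a date
>> string column named dt (table and column names here are illustrative):
>>
>>   SET hive.exec.dynamic.partition=true;
>>   SET hive.exec.dynamic.partition.mode=nonstrict;
>>
>>   -- rewrite only the partitions in the date range being reloaded;
>>   -- the dynamic partition column must come last in the SELECT
>>   INSERT OVERWRITE TABLE my_table PARTITION (dt)
>>   SELECT col1, col2, dt
>>   FROM staging_table
>>   WHERE dt BETWEEN '2014-02-28' AND '2014-03-06';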
>>
>> On Thu, Mar 6, 2014 at 3:53 PM, Kasi Subrahmanyam <kasisubbu...@gmail.com> wrote:
>>
>>> Hi,
>>> I have a table in Hive with three months of data. I have partitioned the
>>> data and got 90 partitions. Now, when I get the new data for the next
>>> day, I want to automatically replace the partition that is one week old
>>> with the new one.
>>>
>>> Can this partitioning and replacement be done using Sqoop at the same
>>> time?
>>>
>>> Thanks,
>>> Subbu
>>>
>>
>>
>>
>> --
>> Nitin Pawar
>>
>
>


-- 
Nitin Pawar
