----------------------------------------
> Date: Wed, 25 May 2011 12:17:07 +0200
> From: s...@elego.de
> To: d...@daniel.shahaf.name
> CC: smith_winston_6...@hotmail.com; dev@subversion.apache.org
> Subject: Re: large number of large binary files in subversion
>
> On Wed, May 25, 2011 at 01:02:28PM +0300, Daniel Shahaf wrote:
> > Stefan Sperling wrote on Wed, May 25, 2011 at 11:45:21 +0200:
> > > On Wed, May 25, 2011 at 10:03:16AM +1100, Winston Smith wrote:
> > > > Yes, I planned to do that for a read-only backup repository as part of
> > > > various backup schedules (daily, weekly, monthly, yearly).
> > >
> > > Unfortunately there is no incremental hotcopy support yet,
> > > see http://subversion.tigris.org/issues/show_bug.cgi?id=3815

That won't be necessary, by design. A hotcopy is always meant to be a full
backup. SVN uses delta storage, so it effectively *is* doing an incremental
backup already. All one has to do is consolidate these incremental backups
into one full backup, something one would have to do anyway when recovering
if SVN didn't do it.

> > > If you ensure that no commits happen during the backup period you
> > > could use rsync instead.
> >
> > It is not safe to rsync live Subversion filesystems. (the result may or
> > may not be corrupt)
>
> That's why I said that no commits should happen. But thanks for
> spelling it out more explicitly.

I recently heard of someone who did an rsync of live postgresql databases
as part of his backup schedule, and was wondering why the result was
corrupted...

I believe an SVN repo can be put into read-only mode quickly by linking the
pre-commit hook to /bin/false (linking is believed to be an atomic operation,
so no race conditions) and then waiting a bit to let any in-flight commits
finish. Read operations are guaranteed not to alter the repo files. Rsyncing
should then be safe, but I still prefer hotcopy, since that is the canonical
way to go... I like canonical and dedicated ways to do things. Rsync seems
to be a silver bullet here...

- Winston
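
P.S. A rough, untested sketch of the freeze-then-copy idea. The paths
(/srv/svn/repo, /backup/repo) and the 30-second grace period are made-up
examples, and it assumes no real pre-commit hook is installed yet (if one
is, move it aside first instead of overwriting it):

    # Reject new commits: point the pre-commit hook at /bin/false,
    # which always exits non-zero, so every commit attempt is refused.
    ln -sf /bin/false /srv/svn/repo/hooks/pre-commit

    # Give any in-flight commits a moment to finish.
    sleep 30

    # The canonical way: let Subversion copy the repository itself
    # (into a fresh target directory).
    svnadmin hotcopy /srv/svn/repo /backup/repo

    # Alternatively, with commits frozen, rsync should also be safe:
    #   rsync -a --delete /srv/svn/repo/ /backup/repo/

    # Re-enable commits by removing the symlink (or restoring the
    # real pre-commit hook if there was one).
    rm /srv/svn/repo/hooks/pre-commit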