Maarten Hoogveld wrote:
> Hello list,
>
> Does anyone have an idea how to solve this efficiently or have any
> experience with this type of problem?
Yes, with exactly this type of problem. I have 200GB of mostly-static data spread fairly evenly over two sites. The problem isn't just the backup - how do you propose to restore 20GB?

My solution is to break the backup into two parts:

Part 1: Sync. Use rsync to get a copy of the data from the server which needs backing up to some location closer to your storage daemon. It works beautifully with Linux; rsync even supports ACLs. You may need to do some fiddling to make it work with Windows, but provided it's data rather than the operating system that needs backing up, I can't see why it shouldn't work. The whole thing is driven by a pre-backup script (rough sketch below). The first time this runs, the backup takes a while. Subsequent syncs, OTOH, are much less painful.

Part 2: Spool to storage (be it tape, disk, etc.). Because you're going over the LAN here, you get LAN performance.

Restore: It may be just as easy to pull the data out of Bacula, put it on a hard disk locally and courier it to the remote site.
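In case it's useful, here's a minimal sketch of the sort of pre-backup script I mean. The host name and paths are invented, and you'll want to check that the rsync on both ends is new enough for -A/--acls (and -X if you care about xattrs). I hook it into the job with a RunBeforeJob = "/path/to/script" directive, so the sync has to finish before the spool-to-storage step starts, and a non-zero exit status should cause Bacula to cancel the job rather than spool a half-synced tree (worth double-checking on your version).

    #!/bin/sh
    # Pull the remote data into a staging area next to the storage daemon,
    # so the actual Bacula job only ever reads over the LAN.
    REMOTE="backup@fileserver.remote-site.example"   # invented host name
    SRC="/srv/data/"                                 # trailing slash = copy contents
    STAGE="/var/spool/bacula-stage/remote-site/"     # point your FileSet here

    # -a archive, -A ACLs, -X xattrs, -z compress over the WAN;
    # --delete keeps the staging copy an exact mirror of the source.
    rsync -aAXz --delete -e ssh "${REMOTE}:${SRC}" "${STAGE}"

-- 
James Cort
IT Manager
U4EA Technologies Ltd.

-- 
U4EA Technologies
http://www.u4eatech.com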