Howdy! I'm still wondering whether it would be possible to use Bacula for backing up xxx TB of data, instead of a more expensive solution with LAN-free backups and snapshots.
The problem is the time window and bandwidth. If Bacula had something like an "incremental forever" feature, the problem could be solved. I know about the Accurate Backup feature, but without deduplication (not backing up/storing the same file more than once) the space needed for backups can still grow: if a user moves or renames a directory with 10 TB of data, Accurate Backup will detect this and back up the data again, instead of just pointing to the new location inside the database.

The new Base Job feature doesn't seem to help either. It's a start, but I don't see how it would help here; it seems aimed more at large numbers of clients with the same files in the same places. What I need is a delta copy with file-level deduplication.

Am I missing something, or is this just not possible right now? Is anyone backing up 100...200 TB of data? (We back up some servers with 10-15 TB filesets, but a full backup takes nearly a week including verify.)

Ralf

_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
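P.S. The file-level deduplication I mean could be sketched roughly like this: store each unique file body once, keyed by a content hash, so that a moved or renamed directory only adds catalog entries rather than re-storing the data. This is a toy Python illustration of the idea, not Bacula code; the names `backup`, `store`, and `catalog` are made up for the example:

```python
import hashlib
import os

def file_digest(path, chunk_size=1 << 20):
    """Content hash of a file, read in chunks to bound memory use."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def backup(tree, store, catalog):
    """Walk `tree`; store each unique file body once, keyed by digest.

    A renamed or moved directory only adds path->digest catalog
    entries; the file bodies are already in `store`, so no new
    data is written. Returns the number of new bytes stored.
    """
    new_bytes = 0
    for dirpath, _dirs, files in os.walk(tree):
        for name in files:
            path = os.path.join(dirpath, name)
            digest = file_digest(path)
            if digest not in store:
                with open(path, "rb") as f:
                    store[digest] = f.read()  # toy in-memory store
                new_bytes += len(store[digest])
            catalog[path] = digest  # path -> content mapping
    return new_bytes
```

With a scheme like this, renaming a 10 TB directory between two backups would cost only catalog updates, since every digest is already present in the store.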