I will try to answer everything you asked, friends:
- I am not using any compression method for the backup.
- The catalog database is PostgreSQL, and during the backup it is not consuming all the CPU or memory, only around 10% CPU. The bacula-fd/sd/dir processes behave exactly the same.
- I am not backing up to tape; the two backups are (see the sketch below):
  - /usr to a volume file stored on /tmp (~70000 files, 2 GB)
  - /root to a volume file stored on /tmp (~100 files, 1.5 GB)
- Doing it with "tar" or "gzip" is not an option: I need to back up 15 different servers, and handling that by hand would be crazy.
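For reference, here is a minimal sketch of the kind of configuration described above: file-based volumes written under /tmp, no compression enabled in the FileSet Options. All resource names (usr-set, root-set, TmpFileStorage, backup-usr, myhost-fd) are made up for illustration and are not the poster's actual config.

  # bacula-dir.conf (sketch)
  FileSet {
    Name = "usr-set"
    Include {
      Options { signature = MD5 }   # no "compression = GZIP" line, matching the no-compression setup
      File = /usr
    }
  }

  FileSet {
    Name = "root-set"
    Include {
      Options { signature = MD5 }
      File = /root
    }
  }

  Job {
    Name = "backup-usr"
    Type = Backup
    Client = myhost-fd
    FileSet = "usr-set"
    Storage = TmpFileStorage
    Pool = Default
    Messages = Standard
  }

  # bacula-sd.conf (sketch): file volumes written under /tmp
  Device {
    Name = TmpFileStorage
    Media Type = File
    Archive Device = /tmp
    LabelMedia = yes
    Random Access = yes
    AutomaticMount = yes
    RemovableMedia = no
    AlwaysOpen = no
  }

A second Job resource pointing at "root-set" would cover the /root backup in the same way.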
Thanks to all.