Hello all, I have the following robustness-related question:
I am deploying Bacula as a multi-site backup solution: a central tape storage server receives relatively large quantities of data (approx. 20 GB per day) from multiple sites over network links of dubious reliability. I am considering two possible options for getting the data to the storage server:

1) Install a File Daemon on each server to be backed up and let it send the data over the network to the storage server.

2) Have the Director (running on the same machine as the storage server) run a script that "tar"s the data over the network into a local temp directory and -- if the connection didn't break in the meantime -- dumps the result to tape.

My question: with alternative 2) I can check for connection losses in the middle of the transfer and restart it, but I have no idea how the File Daemon behaves in the same situation:

- Does it save any state locally that makes it robust against connectivity losses?
- Can it resume an interrupted job?
- How resilient is the storage server to such an interruption in the data stream from the client?
- Can the data be compressed as it is sent from the File Daemon to the Storage Daemon?

TIA for any hints and pointers,

Florian

_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
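On the compression question: in Bacula, software compression is a per-FileSet option and is performed by the File Daemon, so the data already travels compressed over the wire to the Storage Daemon. A minimal fragment for the Director's configuration (the FileSet name and path are placeholders):

```conf
# bacula-dir.conf fragment (sketch)
FileSet {
  Name = "SiteData"
  Include {
    Options {
      signature = MD5
      compression = GZIP   # FD compresses before sending to the SD
    }
    File = /data
  }
}
```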
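For what it's worth, alternative 2) might be sketched roughly as below. This is not Bacula itself, just the "transfer, verify, then dump" script logic the option describes; the spool path is a placeholder, and a local tar of /etc/hosts stands in for the real `ssh site "tar -cf - /data"` transfer so the sketch is self-contained:

```shell
#!/bin/sh
# Sketch of option 2: pull data into a local spool directory and only
# hand the archive to the tape job if the whole transfer succeeded.
SPOOL_DIR=${SPOOL_DIR:-/tmp/bacula-spool}
ARCHIVE="$SPOOL_DIR/site1.tar"
mkdir -p "$SPOOL_DIR"

transfer() {
    # Placeholder for the real transfer, e.g.:
    #   ssh "$REMOTE_HOST" "tar -cf - /data"
    # (ssh propagates the remote tar's exit status, so a broken
    # connection or failed remote tar shows up as a nonzero status.)
    tar -cf - /etc/hosts
}

if transfer > "$ARCHIVE"; then
    # The whole stream arrived; safe to dump the archive to tape.
    echo "transfer complete: $ARCHIVE"
else
    # Connection broke part-way: discard the partial archive so a
    # truncated backup never reaches tape, and retry later.
    rm -f "$ARCHIVE"
    echo "transfer failed; partial archive discarded" >&2
fi
```

The key point is that the archive is only ever offered to the tape job after the transfer command has exited successfully, so a half-written file can never be mistaken for a complete backup.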