Rich wrote:
> On 2007.08.23. 14:51, Angel Mieres wrote:
>
>> I will try to answer everything you said, friends.
>> - I don't use a compression method for the backup.
>> - The database is PostgreSQL, and during the backup it isn't taking all the
>> CPU or memory, only about 10% CPU. Processes like bacula-fd/sd/dir behave
>> exactly the same.
>>
>
> what about io-based database load ?
> how high are iowait values during the backup ?
>
This is the output of `iostat -k 2` while a backup of small files is running:

avg-cpu:  %user   %nice    %sys %iowait   %idle
           4.51    0.00    1.63   22.90   70.96

Device:            tps    kB_read/s    kB_wrtn/s    kB_read    kB_wrtn
fd0               0.00         0.00         0.00          0          0
sda             248.74       478.39      4168.84        952       8296
And this is the output while large files are being backed up:

avg-cpu:  %user   %nice    %sys %iowait   %idle
           4.37    0.00    9.99   38.08   47.57

Device:            tps    kB_read/s    kB_wrtn/s    kB_read    kB_wrtn
fd0               0.00         0.00         0.00          0          0
sda             288.00     29820.00       272.00      59640        544
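As a side note, the %iowait figure can be pulled out of captured iostat output with a short awk one-liner, which makes it easy to log iowait over a whole backup job. A minimal sketch, assuming the sysstat iostat output format shown above (the `sample` variable here just holds a copy of the large-file run):

```shell
# Sketch: extract %iowait from captured iostat output with awk.
# The sample text is copied from the large-file backup run above.
sample='avg-cpu:  %user   %nice    %sys %iowait   %idle
           4.37    0.00    9.99   38.08   47.57'

# %iowait is the 4th field on the line following the avg-cpu header
echo "$sample" | awk '/avg-cpu/ {getline; print $4}'   # prints 38.08
```

Piping a live `iostat -k 2` through the same awk filter would print one iowait sample every two seconds while the job runs.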
>> - I'm not backing up to a tape; the two backups are:
>>   - /usr to a volume file stored on /tmp (~70000 files, 2 GB)
>>   - /root to a volume file stored on /tmp (~100 files, 1.5 GB)
>> - Pre-packing the files with "tar" or "gzip" is not possible: I need to
>> back up 15 different servers, and that would be unmanageable.
>>
>> Thanks to all.
>>
> ...
>
_______________________________________________
Bacula-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/bacula-users