19.05.2021, 07:36, Gary R. Schmidt:
On 19/05/2021 14:48, fk+bacula--- via Bacula-users wrote:
Hi there,

I am running a daily incremental backup with a FileSet of multiple directories. The daily incremental backup time fluctuates between 4 and 16 hours, while roughly 3 to 10 GB of data are collected.

To optimize the process, it would be nice to find out how much time the FD has spent on each directory, and how many files and how much data were sent to the SD.

Any useful hints on how to monitor this?

Thanks for any help, Frank

Fiddling with the code in the file daemon for this probably won't happen unless there's a massive outcry for it, but you can get the list of files by running the equivalent file search on the system at the same time.

Something like "find / -mtime -1 -o -ctime -1" would be a starting point.
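To take that a step further, here is a rough sketch that counts the files changed in the last day and sums their sizes per top-level directory. It assumes GNU find (for -printf), and the directories given are placeholders -- substitute the ones from your FileSet:

  # Count files and bytes changed in the last day, per top-level directory.
  find /home /var/www -type f \( -mtime -1 -o -ctime -1 \) -printf '%h %s\n' |
      awk '{ split($1, p, "/"); d = "/" p[2]; n[d]++; s[d] += $2 }
           END { for (d in n) printf "%s: %d files, %.1f MB\n", d, n[d], s[d]/1e6 }'

Comparing its output against the job's elapsed time should show which directories dominate the backup.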

Asking the people who are generating the data, "Why is it so?", might also be helpful.

Thanks, nice idea. I will also try to find out whether I can get my answer by looking into the catalog database. There it should be possible to see the files of the last incremental backup job, and then filter and count them, grouped by directory.
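For reference, a minimal sketch of such a query, assuming a PostgreSQL catalog named "bacula" with the classic schema where File links to Path via PathId; the JobId (1234 here) is a placeholder for the last incremental job. Note that file sizes are packed base64-encoded into File.LStat rather than stored in a plain column, so counting files per directory is easy in SQL, but summing their sizes is not:

  # Count the files of JobId 1234 per top-level directory.
  psql bacula -c "
      SELECT substring(Path.Path from '^/[^/]*/') AS topdir,
             count(*) AS nfiles
      FROM File
      JOIN Path ON Path.PathId = File.PathId
      WHERE File.JobId = 1234
      GROUP BY topdir
      ORDER BY nfiles DESC;"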

Frank


