I have a particular host/job with roughly 1.2 million files at 18 GB... 
that's one full backup.

When running a restore, it took almost 28 hours for Bacula to finish the 
"building directory tree for jobid blah" step, pegging one CPU at 100%. 
Once that is done, the actual file selection and restore operations are 
quite fast and normal. Note I was using option #5, "most recent full backup"...

My question is: how can this initial building of the directory tree be 
sped up? A million files doesn't seem like a lot (I've seen folks post in 
the archives here about 6-million-file jobs)....

Could it be the binary routines mentioned here causing the slowness? 
http://www.nabble.com/Re%3A-Rocket-science%3A-was-Re%3A-Bacula-version-1.38.10-released-p4882496.html


bacula-dir (1.38.11) is running on a dual AMD Opteron with 2 GB of RAM. 
We're not using tapes; volumes are written directly to an attached storage 
array. The OS is SUSE 9.3 Pro x86_64.
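
For what it's worth, one thing I haven't tried yet: the tree build reads 
every File row for the job, so missing catalog indexes could explain the 
CPU-bound crawl. A sketch of the check, assuming a MySQL catalog (the 
index name "file_jobid_idx" is just an example, not something Bacula 
creates by that name automatically):

    -- Check which indexes exist on the File table:
    SHOW INDEX FROM File;

    -- If JobId is not indexed, the tree build effectively scans the
    -- whole File table for each job; adding an index is the usual
    -- first tuning step suggested in the catalog maintenance docs:
    CREATE INDEX file_jobid_idx ON File (JobId);

If anyone knows whether that applies to the directory-tree phase 
specifically, I'd appreciate confirmation.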


Thanks,

Jeremy

_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
