Hi,

I have an issue when restoring a large data set of about 6.5 TB and
35,000,000 files: the console takes an extremely long time, over an hour,
to build the directory tree. After the tree was built I typed "mark *",
and that command ran for about 18 hours before I hit Ctrl-C; the MySQL
server showed no activity and neither bconsole nor bacula-dir was using
any CPU, so I believe the process just petered out. I was wondering
whether anyone has been successful in restoring multi-terabyte data sets,
either ones containing a large number of individual files, or ones 6 TB
or larger containing only a small number of large files?
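
For what it's worth, the workaround I am considering is skipping the
interactive tree entirely by passing modifiers to the restore command,
which (as I understand the 5.0.x bconsole) selects all files from a job
without building or marking the tree. The client name, jobid, and where
path below are placeholders, not my actual configuration:

```
* restore client=bigfs-fd jobid=1234 all done yes where=/tmp/bacula-restore
```

I have not yet confirmed that this avoids the catalog stall, so any
experience with it on similarly large file sets would be welcome.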

A little more system info:

bacula-dir Version: 5.0.3 (director and storage reside on separate systems)
OS FreeBSD 8.2-RELEASE #0

Thanks in advance for any shared results!



_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
