Chris Hunter wrote:
Hi,

I am running Bacula 1.38.4 on Scientific Linux, a RHEL4 clone, with SQLite3 as my catalog database. My catalog has roughly 200,000 files; I back up approximately 500 GB weekly.

Question #1: Is a 200,000-file catalog too big for SQLite?

I ask because the one restore job I have run spent nearly 7 hours indexing the catalog, but only 15 minutes reading from tape (approx. 10 GB / 300 files).
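A catalog query that takes hours on only 200K files often means SQLite is doing full table scans rather than indexed lookups. The sketch below is a minimal illustration of that effect, using a hypothetical, simplified stand-in for a catalog file table (the real Bacula schema differs); it compares SQLite's query plan before and after adding an index:

```python
import sqlite3

# Hypothetical, simplified stand-in for a Bacula-style File table --
# the real catalog schema differs; this only shows the index effect.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE File ("
    "  FileId INTEGER PRIMARY KEY,"
    "  JobId INTEGER,"
    "  FileIndex INTEGER)"
)
con.executemany(
    "INSERT INTO File (JobId, FileIndex) VALUES (?, ?)",
    [(n % 50, n) for n in range(10_000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows end with a human-readable detail column.
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM File WHERE JobId = 7"
before = plan(query)   # without an index: a scan over the whole table
con.execute("CREATE INDEX file_jobid_idx ON File (JobId)")
after = plan(query)    # with the index: a search using file_jobid_idx

print("before:", before)
print("after: ", after)
```

If the unindexed plan reports a SCAN and the indexed one a SEARCH, the same query drops from touching every row to touching only the matching ones, which is the kind of difference that turns hours into minutes on a large catalog.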

Related question: I notice that backup jobs are multi-threaded; according to top, backups use both CPUs on my SMP machine. However, the restore job used only one thread.

Question #2: Are restores multi-threaded?


To #1:
I would suggest using PostgreSQL for any production use.
It's not much hassle to set up and is a lot faster; also, SQLite might run into a database size limit.

florian


_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
