On Tue, Nov 06, 2012 at 04:15:18PM +0100, lst_ho...@kwsoft.de wrote:

> To my knowledge the jobs are created but put on hold as long as
> other jobs with a different priority are running. So it looks like the
> catalog connections are established as part of job creation. That
> said, you should not need to stop the catalog database for the dump; in
> fact I wonder that it works at all to back up the catalog this way. If
> you are in doubt about taking online dumps, you can also use PostgreSQL,
> which might be a good choice at that scale anyway ;-)
>
> Regards
>
> Andreas
Hello Andreas,

Thanks very much for your reply. We've found that Bacula reconnects to the
database without problems if you restart it from within a RunBefore script,
and at the moment this seems to be the only option for getting catalog
backup speed back to a reasonable level: dumping the database through
gzip --fast takes five hours or so, while an rsync takes only about 30
minutes, including lzop compression of the resulting file.

I don't want to start another holy database war, but I was wondering how
others (using MySQL) handle catalog backups once the catalog size exceeds
170 GB.

All the best,
Uwe

_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
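For reference, one way to wire up the RunBefore approach Uwe describes is a
RunScript block in the catalog backup Job resource. This is only a sketch
under assumptions: the job name, script path, data directory, and snapshot
destination are all illustrative, not Uwe's actual setup.

```
# Sketch of a Bacula Director Job resource (bacula-dir.conf) that runs a
# snapshot script before the catalog backup. All names and paths below
# are assumptions for illustration.
Job {
  Name = "BackupCatalog"
  JobDefs = "DefaultJob"
  Level = Full
  FileSet = "Catalog"
  RunScript {
    RunsWhen = Before
    RunsOnClient = No    # run on the Director host, where MySQL lives
    # A hypothetical catalog-snapshot.sh would do roughly:
    #   service mysql stop
    #   rsync -a /var/lib/mysql/ /backup/catalog-snapshot/
    #   tar -cf - -C /backup/catalog-snapshot . | lzop > /backup/catalog.tar.lzo
    #   service mysql start
    Command = "/usr/local/sbin/catalog-snapshot.sh"
  }
}
```

Because rsync only transfers changed blocks and lzop compresses quickly,
this kind of file-level snapshot can be much faster than a full mysqldump
on a catalog of this size; the trade-off is the brief MySQL downtime while
the files are copied.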