On Mon, May 01, 2006 at 10:43:11PM +0100, Alan Brown wrote:
> On Mon, 1 May 2006, John Kodis wrote:
> 
> >You have more CPU than I do, but only about half the memory.  Since
> >you say that the disks are thrashing, I'd guess that the lack of
> >memory is more likely to be the culprit than the difference in
> >databases.  There have, however, been some messages posted here
> >indicating that MySQL needs more attention to tuning when backing up
> >this many files than a similar Postgres installation does.
> 
> By way of comparison my systems were thrashing badly even with 2Gb of ram 
> because mySQL wasn't using enough memory. Allowing it to grow to 1Gb ram 
> solved it almost entirely.

I have only 512 MB of RAM, but you made me take another look at my server parameters.

> Adjusting /etc/my.cnf is pretty much a necessity on large database 
> systems, or mysql will make extensive use of temporary files and things 
> can grind to a halt quickly. The default settings are only ok for 
> small systems or for testing.

While I used the settings from the "large" my.cnf example configuration file
and saw a slight speedup, it still took longer than a few hours, so I
interrupted it.
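
For reference, the "large" example file (my-large.cnf) targets a machine with
roughly 512 MB of RAM that mostly runs MySQL, so it should match my box. The
buffer settings in it look roughly like the excerpt below -- I'm quoting from
memory, so better check the example file shipped with your MySQL version
instead of copying these numbers:

  # excerpt from the stock my-large.cnf (values quoted from memory)
  [mysqld]
  key_buffer = 256M
  sort_buffer_size = 1M
  read_buffer_size = 1M
  read_rnd_buffer_size = 4M
  myisam_sort_buffer_size = 64M
  query_cache_size = 16M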

Following hints from the helpful inhabitants of #bacula on irc.freenode.net, I
ran dbcheck to look for orphaned rows. Voila, my row count dropped from 21
million to 3 million. I even needed two runs, because dbcheck seemed to have
an upper limit of 10 million entries it would delete in one pass. :)
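
In case someone wants to repeat this: dbcheck ships with Bacula and talks
directly to the catalog. The invocation is roughly like the one below, but the
exact arguments differ between Bacula versions, so check dbcheck's own usage
message first (paths and credentials here are just placeholders):

  # arguments: working directory, database name, database user;
  # append the catalog password if your setup needs one
  dbcheck /var/bacula/working bacula bacula

It then shows a menu, and the entries for eliminating orphaned records
(including orphaned File records) are the interesting ones here.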

I have no idea why my catalog grew like this. I did restore it once with bscan
after the catalog was lost and I had to recover the information from 20 DDS-3
tapes (not the most interesting tale of my life).

I still can't explain why the restore took that long on my main server. The
very same catalog was imported within an hour on a workstation. Pretty
strange.

What's also still a bit weird: I have 450,000 distinct files in my backups,
yet the `File` table holds 3 million rows. I was told that the number of rows
there is independent of the number of times a certain file was backed up, so
it shouldn't simply be 450,000 files x 10 backups = 4,500,000 rows.
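
If anyone wants to check their own catalog the same way: assuming the usual
schema where each File row points at the Path and Filename tables via PathId
and FilenameId (column names from memory), queries along these lines show the
ratio:

  -- total rows in the File table
  SELECT COUNT(*) FROM File;
  -- distinct path/name combinations referenced by those rows
  SELECT COUNT(DISTINCT PathId, FilenameId) FROM File;
  -- average number of File rows per distinct file
  SELECT COUNT(*) / COUNT(DISTINCT PathId, FilenameId) FROM File;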

Kindly
 Christoph


