Due to a typo in my configuration, I have one 750G volume! (arrgh!) I forgot to add:

  Maximum Volume Bytes = 50G    # Limit Volume size to something reasonable
  Maximum Volumes = 50          # Limit number of Volumes in Pool
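For context, here is a minimal sketch of where those two directives would sit in a Pool resource in bacula-dir.conf. The pool name, label format and retention values are placeholders I've made up for illustration, not my real settings:

  # Rough sketch only -- pool name, retention and label format are
  # illustrative placeholders.
  Pool {
    Name = File-Pool
    Pool Type = Backup
    Recycle = yes                    # allow purged Volumes to be reused
    AutoPrune = yes                  # prune expired Jobs automatically
    Volume Retention = 30 days
    Label Format = "FileVol-"
    Maximum Volume Bytes = 50G       # Limit Volume size to something reasonable
    Maximum Volumes = 50             # Limit number of Volumes in Pool
  }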
Now that I've added these extra directives, I'm assuming that the current mega volume will be closed with an EOF and a new volume created, which will max out in size at 50G.

The problem I currently have is that I think it would be a good idea to move the jobs out of this massive file. I'm probably a bit paranoid, but I've debated for hours over volume sizes! We almost made 4G volumes our standard, so that they can be copied onto DVD if need be. Hmm... maybe I'll start a new thread about the best volume size for disk-based storage that needs to be rsynced somewhere else.

Back to my problem. It seems to me that a few migrate jobs are in order; my understanding of the migrate job process is as follows:

  - Bacula reads the old job from the pool
  - Bacula stores this data in new volumes
  - Bacula 'deletes' the old job?

Once I've moved all the jobs off this massive 750G volume, I can then mark the volume to be recycled, and it should rewrite the volume and cap it at 50G? (A rough sketch of what I have in mind follows after my sig.)

--
The Solo System Admin - Follow me - I follow you
http://solosysad.blogspot.com/
Latest Entry: Bacula - Building from source
--
Mister IT Guru At Gmx Dot Com
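To make the migrate-then-recycle idea above concrete, here is a rough, untested sketch of what I have in mind. All names (pools, job, client, fileset, volume) are placeholders; the destination pool is selected via Next Pool in the source Pool resource, as I understand the migration chapter of the manual:

  # Source pool gains a Next Pool pointing at the destination pool
  # (all names are placeholders).
  Pool {
    Name = File-Pool                  # pool holding the 750G Volume
    Pool Type = Backup
    Next Pool = File-Pool-New         # migrated jobs are written here
    # ... plus the usual Recycle / retention / Maximum Volume Bytes directives
  }

  # Migrate job modelled on the example in the Bacula manual; Client and
  # FileSet are required by the config parser even though the data only
  # moves between Storage daemon Volumes.
  Job {
    Name = "migrate-big-volume"
    Type = Migrate
    Level = Full
    Client = some-client-fd
    FileSet = "Full Set"
    Messages = Standard
    Pool = File-Pool                  # jobs are read from this pool
    Selection Type = Volume
    Selection Pattern = "BigVol-0001" # regex matching the 750G Volume
  }

  # Afterwards, in bconsole (volume name again a placeholder):
  *list volumes pool=File-Pool        # check what is still on the old Volume
  *purge volume=BigVol-0001           # drop its remaining catalog records (careful!)
  *update volume=BigVol-0001          # menu: set Recycle and Max Volume Bytes
                                      # so the existing Volume picks up the 50G cap

If I've read the docs right, Pool directives such as Maximum Volume Bytes are copied into each Volume's catalog record when the Volume is labelled, so the existing 750G Volume probably needs that explicit update before the new cap applies to it.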