For some reason, my backups are not rotating fast enough and my backup drive is full. What I would like is a retention of three months (one full backup) with weekly incrementals. The drive being backed up is 800GB, while the backup drive itself is 1.8TB.
Is this the best way to do it, or have I configured something incorrectly?
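In case it helps, below is roughly the policy I think I'm after. The resource names match my existing config, but the values are only my guesses at how to express "one full per three months, weekly incrementals, recycle after three months" - please correct me if this is the wrong way to do it:

Schedule {
  Name = "WeeklyCycle"
  Run = Full 1st mon jan,apr,jul,oct at 01:05        # one full per three-month cycle
  Run = Incremental 1st mon feb,mar,may,jun,aug,sep,nov,dec at 01:05
  Run = Incremental 2nd-5th mon at 01:05             # weekly incrementals on the other Mondays
}

Pool {
  Name = File
  Pool Type = Backup
  Recycle = yes
  AutoPrune = yes
  Volume Retention = 3 months    # long enough to hold one full cycle before recycling?
  Maximum Volume Bytes = 50G
  Maximum Volumes = 36           # caps the Pool at roughly 1.8TB so the drive can't overfill
}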
Here is my bacula-dir.conf:
Director {                            # define myself
  Name = als-dir
  DIRport = 9101                      # where we listen for UA connections
  QueryFile = "/etc/bacula/scripts/query.sql"
  WorkingDirectory = "/var/lib/bacula"
  PidDirectory = "/var/run/bacula"
  Maximum Concurrent Jobs = 1
  Messages = Daemon
  DirAddress = 127.0.0.1
}

JobDefs {
  Name = "DefaultJob"
  Type = Backup
  Level = Incremental
  Client = als-fd
  FileSet = "Full Set"
  Schedule = "WeeklyCycle"
  Storage = File
  Messages = Standard
  Pool = File
  Priority = 10
  Write Bootstrap = "/var/lib/bacula/%c.bsr"
}

Job {
  Name = "BackupLocalFiles"
  JobDefs = "DefaultJob"
}

Job {
  Name = "BackupCatalog"
  JobDefs = "DefaultJob"
  Level = Full
  FileSet = "Catalog"
  Schedule = "WeeklyCycleAfterBackup"
  # This creates an ASCII copy of the catalog
  # Arguments to make_catalog_backup.pl are:
  #   make_catalog_backup.pl <catalog-name>
  RunBeforeJob = "/etc/bacula/scripts/make_catalog_backup.pl MyCatalog"
  # This deletes the copy of the catalog
  RunAfterJob = "/etc/bacula/scripts/delete_catalog_backup"
  Write Bootstrap = "/var/lib/bacula/%n.bsr"
  Priority = 11                       # run after main backup
}

Job {
  Name = "RestoreLocalFiles"
  Type = Restore
  Client = als-fd
  FileSet = "Full Set"
  Storage = File
  Pool = Default
  Messages = Standard
  Where = /home/BCK/restore
}

FileSet {
  Name = "Full Set"
  Include {
    Options {
      signature = MD5
      compression = GZIP
    }
    File = /
    File = /boot/efi
    File = /home
  }
  Exclude {
    File = /var/lib/bacula
    File = /home/BCK
    File = /proc
    File = /tmp
    File = /.journal
    File = /.fsck
    File = /dev
    File = /run
  }
}

Schedule {
  Name = "WeeklyCycle"
  Run = Full 1st mon at 01:05
  Run = Incremental tue-sun at 01:05
}

Schedule {
  Name = "WeeklyCycleAfterBackup"
  Run = Full mon-sun at 01:10
}

FileSet {
  Name = "Catalog"
  Include {
    Options {
      signature = MD5
    }
    File = "/var/lib/bacula/bacula.sql"
  }
}

Client {
  Name = als-fd
  Address = localhost
  FDPort = 9102
  Catalog = MyCatalog
  File Retention = 30 days
  Job Retention = 6 months
  AutoPrune = yes
}

Storage {
  Name = File
  Address = als                       # N.B. Use a fully qualified name here
  SDPort = 9103
  Device = FileStorage
  Media Type = File
}

Catalog {
  Name = MyCatalog
}

Messages {
  Name = Standard
  mailcommand = "/usr/sbin/bsmtp -h localhost -f \"\(Bacula\) \<%r\>\" -s \"Bacula: %t %e of %c %l\" %r"
  operatorcommand = "/usr/sbin/bsmtp -h localhost -f \"\(Bacula\) \<%r\>\" -s \"Bacula: Intervention needed for %j\" %r"
  mail = root = all, !skipped
  operator = root = mount
  console = all, !skipped, !saved
  append = "/var/log/bacula/bacula.log" = all, !skipped
  catalog = all
}

Messages {
  Name = Daemon
  mailcommand = "/usr/sbin/bsmtp -h localhost -f \"\(Bacula\) \<%r\>\" -s \"Bacula daemon message\" %r"
  mail = root = all, !skipped
  console = all, !skipped, !saved
  append = "/var/log/bacula/bacula.log" = all, !skipped
}

Pool {
  Name = Default
  Pool Type = Backup
  Recycle = yes                       # Bacula can automatically recycle Volumes
  AutoPrune = yes                     # Prune expired volumes
  Volume Retention = 365 days         # one year
}

Pool {
  Name = File
  Pool Type = Backup
  Label Format = Local-
  Recycle = yes                       # Bacula can automatically recycle Volumes
  AutoPrune = yes                     # Prune expired volumes
  Volume Retention = 365 days         # one year
  Maximum Volume Bytes = 50G          # Limit Volume size to something reasonable
  Maximum Volumes = 100               # Limit number of Volumes in Pool
}

Pool {
  Name = Scratch
  Pool Type = Backup
}

Console {
  Name = als-mon
  CommandACL = status, .status
}
Thank you,
-Al