Hi all.  I've finally gotten backups of my clients working correctly
(except for some issues with NT Backup handling the registry)...but
I'm having a problem backing up the Catalog.  I'm using the default job,
tweaked a bit:

Job {
  Name = "BackupCatalog"
  JobDefs = "DefaultJob"
  Level = Full
  FileSet="Catalog"
  Schedule = "WeeklyCycleAfterBackup"
  RunBeforeJob = "/usr/local/share/bacula/make_catalog_backup bacula bacula"
  RunAfterJob  = "/usr/local/share/bacula/delete_catalog_backup"
  Write Bootstrap = "/var/db/bacula/BackupCatalog.bsr"
  Priority = 11
  Storage = CatalogBackup
  Pool = CatalogBackup
}
JobDefs {
  Name = "DefaultJob"
  Type = Backup
  Level = Incremental
  Client = beryl-fd
  Schedule = "WeeklyCycle"
  Storage = File
  Messages = Standard
  Pool = Default
  Priority = 10
}
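
For completeness, the "Catalog" FileSet referenced above is essentially
the stock one from the sample bacula-dir.conf, pointing at the dump file
that make_catalog_backup writes into the working directory (the exact
path below is my guess based on where my bootstrap file lives; adjust to
your install):

```
FileSet {
  Name = "Catalog"
  Include {
    Options {
      signature = MD5
    }
    # bacula.sql is written by make_catalog_backup; path assumed to be
    # the director's working directory
    File = /var/db/bacula/bacula.sql
  }
}
```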

The job starts and runs the pgsql dump (I can see the dump file being
created), but then it stalls.  The job status still says running, yet the
file daemon never reports any bytes written.  For example:

Director connected at: 12-Jul-05 09:48
JobId 189 Job BackupCatalog.2005-07-12_09.41.51 is running.
    Backup Job started: 12-Jul-05 09:41
    Files=0 Bytes=0 Bytes/sec=0
    Files Examined=0
    SDReadSeqNo=5 fd=6

Has anyone seen this before?  Any ideas?

Thanks,
--Brian


_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
