So I've poked around all day trying to track down an overly large backup,
and I've realized something: I can't find any useful way to get a
list of the jobs stored on a given volume / media.  Even with direct
SQL, I can't work out a join that will tell me

Daily-0019 contains jobs client1, client2, etc.

I'm certain there has to be a way, but I can't seem to find it.
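For reference, the direction I've been trying, assuming the stock catalog
schema (Job, JobMedia, and Media tables, with JobMedia linking the other
two) — this looks like it should work, but I may be missing something:

```sql
-- List the jobs written to a given volume, via the JobMedia link table.
-- Table and column names are from the standard Bacula catalog schema.
SELECT DISTINCT Job.JobId, Job.Name, Job.StartTime, Job.JobBytes
FROM Job
JOIN JobMedia ON JobMedia.JobId = Job.JobId
JOIN Media    ON Media.MediaId  = JobMedia.MediaId
WHERE Media.VolumeName = 'Daily-0019'
ORDER BY Job.StartTime;
```

I believe bconsole's `query` command also ships with a canned query along
the lines of "list jobs stored on a volume", if that's easier than raw SQL.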

Likewise, I've noticed that it's easy to see the total size of a job,
but I can't figure out how to see how much was backed up on a per-partition
basis, or how big the individual files were.  I'd love to be able to use
direct SQL to find the largest files in a job, but the only byte counts I
can find are for the entire job.  Can someone clue me in on how to get more
granularity here?
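As far as I can tell, the per-file size isn't in a column of its own: the
File table keeps the whole stat() result base64-encoded in the LStat field,
which would explain why plain SQL can't sort by it.  A client-side decoding
sketch, assuming Bacula's standard field order (st_size is the 8th
space-separated field) and its base64 alphabet — untested against a real
catalog, so treat the field index as an assumption:

```python
# Decode the file size out of a Bacula File.LStat string.
# LStat is a space-separated list of base64-encoded integers
# (standard base64 alphabet, no padding, most-significant digit first).
# Assumed field order, per Bacula's encode_stat():
#   dev ino mode nlink uid gid rdev size ...

B64 = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"
VAL = {c: i for i, c in enumerate(B64)}

def b64_to_int(s: str) -> int:
    """Decode one of Bacula's base64-encoded integers."""
    n = 0
    for c in s:
        n = (n << 6) | VAL[c]
    return n

def lstat_size(lstat: str) -> int:
    """Return st_size from an LStat string (field 8, index 7 -- assumed)."""
    return b64_to_int(lstat.split()[7])
```

You'd pull File.LStat (joined against Filename and Path for the names) and
sort on the decoded size in the client rather than in SQL.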

In short, I had a client with 150 GB of total disk that was producing
350 GB backups.  The culprit is probably a sparse file, but I can't
determine which file it is...  I ended up making a bunch of separate jobs
and running Estimate on each to find it.  I had hoped to be able to
retrieve that information from the database without having to hack around
on the configuration.

-- 
Jo Rhett
senior geek
SVcolo : Silicon Valley Colocation

_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users