Greetings!

I'm relatively new to Bacula, and this is my first deployment.  The 
environment is completely disk based: a single Director and SD run on the 
same host, and a few Windows test servers run FDs.

Because it's all disk based, I'm using compression=GZIP4 in my FileSet 
Options.  I have two questions regarding compression:


1)      What does the percentage reported in the job statistics actually 
mean?  Is it how well the compression is doing (space saved), or how much CPU 
it took to do it?  I've seen references to both.
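
(The line I mean is the one that shows up in the job report roughly like 
this -- the number is just an example from one of my test jobs:)

```
  Software Compression:   34.5 %
```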

2)      Is there an easy way in bconsole to see the actual on-disk size of the 
data a job backed up?  All of the size numbers reported seem to be 
post-compression, which makes sense since the compression happens on the 
client, but I'm assuming Bacula stores the actual pre-compression size in the 
catalog somewhere.  I'd love to quickly compare how much real data I'm backing 
up against how much virtual tape space it's taking.
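
In case it clarifies what I'm after, I was imagining something like the query 
below via bconsole's "sqlquery" command.  I'm guessing at the catalog schema 
here -- I believe the Job table has JobId, Name, Level, JobFiles, and JobBytes 
columns, but I don't know whether JobBytes is pre- or post-compression, or 
whether an uncompressed total is stored at all:

```
-- My guess: JobBytes is the total written for the job (post-compression?);
-- I don't know which column, if any, holds the original uncompressed size.
SELECT JobId, Name, Level, JobFiles, JobBytes
FROM Job
ORDER BY JobId DESC
LIMIT 10;
```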

Thanks!

Joseph Dickson
AJ Boggs
joseph.dick...@ajboggs.com

_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users