Hello list,
I'm a bit puzzled. I have three different kinds of jobs, each with its
own storage, media type, and fileset.
My maximum number of concurrent jobs is 20, but I can only run 2
concurrent jobs (each of a different type); if I start a third one, it
waits or blocks the other two.
Is there a setting I'm missing?
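[Editor's note: a common cause is that Maximum Concurrent Jobs defaults to 1 in several resources, so it must be raised not only in the Director resource but also in each Storage, Client, and Job resource involved (and in the Storage daemon's configuration). A minimal bacula-dir.conf sketch follows; the resource names are hypothetical and the trimmed directives are marked with "...".]

```conf
# bacula-dir.conf (sketch -- resource names are made up for illustration)
Director {
  Name = srv-dir
  Maximum Concurrent Jobs = 20   # global cap for the Director
  ...
}
Storage {
  Name = File1
  Maximum Concurrent Jobs = 20   # defaults to 1 if omitted
  ...
}
Client {
  Name = host1-fd
  Maximum Concurrent Jobs = 20   # per-client limit, also defaults to 1
  ...
}
Job {
  Name = BackupHost1
  Maximum Concurrent Jobs = 20   # per-job limit
  ...
}
```

The Storage daemon's bacula-sd.conf has its own Storage resource with a Maximum Concurrent Jobs directive; the effective limit is the smallest value along the chain.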
Hello,
For reasons of space and time I would like to limit the job size. Is
there a way to make a job fail if the stored size is more than xx bytes?
(Note: I store to disk.)
Matthieu.
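[Editor's note: one possible approach is a RunScript that runs after the job and compares the job's byte count against a cap; Bacula substitutes the job bytes with %b in the Command line. The script below is a hypothetical sketch (the path, the limit, and the exact failure semantics are assumptions, not a tested recipe), invoked from the Job resource as e.g. RunScript { RunsWhen = After; Command = "/etc/bacula/check_job_size.sh %b" }.]

```shell
#!/bin/sh
# check_job_size.sh -- hypothetical RunAfterJob helper (sketch).
# Bacula passes the job's byte count as the first argument via %b.
LIMIT=50000000000           # example cap (~50 GB); adjust to taste
BYTES=${1:-0}               # job bytes handed in by Bacula; 0 if absent
if [ "$BYTES" -gt "$LIMIT" ]; then
    echo "Job size $BYTES exceeds limit $LIMIT" >&2
    exit 1                  # non-zero exit flags the job in error
fi
echo "Job size $BYTES within limit"
```

Note that this only flags the job after the data has already been written; it does not stop the backup mid-transfer.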
Hello,
I tried Bacula 5.0.1 and I'm getting this error when I try a full backup:
srvbasu01-dir Max Volume jobs exceeded. Marking Volume "Full-0036" as Used.
srvadet01-fd ERROR in smartall.c:124 Failed ASSERT: nbytes > 0
srvbasu01-dir Fatal error: Network error with FD during Backup: ERR=No data