Hi all,

After wrestling with a Dell TL4000 in the thread marked "Dell TL4000 labeling timeout", it looks like the autochanger is going to be fine, thanks to the efforts of several people on this list, especially Ana.

Moving forward, I'm about to start running jobs to backfill a large dataset (about 250TB) and then do daily backups of it. The dataset itself is tens of millions of small, compressed files, so I don't particularly want to back up the raw files in Bacula; the catalog database would likely become quite unhappy with me. Instead, I have a staging directory where I tar up a time-sequence of the files, and then use Bacula to back up that tar, which is named for the time range it contains. These tapes are to be archived offsite indefinitely.
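In case it helps to picture the staging step, here's roughly what I have in mind; it's only a sketch, and the paths, the mtime-based window, and the naming convention are placeholders of my own, nothing Bacula-specific:

    #!/usr/bin/env python3
    """Sketch of the staging step: tar one time-window of files so Bacula
    only ever sees a handful of large archives, not millions of small files."""

    import tarfile
    from datetime import datetime, timedelta
    from pathlib import Path

    DATASET_DIR = Path("/data/dataset")   # hypothetical source tree
    STAGING_DIR = Path("/data/staging")   # directory the Bacula FileSet would point at

    def stage_window(start: datetime, end: datetime) -> Path:
        """Tar every file whose mtime falls in [start, end) into one archive
        whose name records the window, so each tape copy is self-describing."""
        name = f"dataset_{start:%Y%m%dT%H%M%S}_{end:%Y%m%dT%H%M%S}.tar"
        archive = STAGING_DIR / name
        # The source files are already compressed, so use plain "w" (no gzip)
        # and let the drive stream uncompressed tar data.
        with tarfile.open(archive, "w") as tar:
            for path in DATASET_DIR.rglob("*"):
                if path.is_file() and start.timestamp() <= path.stat().st_mtime < end.timestamp():
                    tar.add(path, arcname=path.relative_to(DATASET_DIR))
        return archive

    if __name__ == "__main__":
        end = datetime.now()
        stage_window(end - timedelta(days=1), end)  # yesterday's slice for the daily run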
My questions are:

1) For the backfill, should I aim to create single files of about 2.5TB so as to completely fill each tape?
2) If I make a tar larger than a tape's capacity (LTO-6), will Bacula automatically span tapes?
3) Given the size of the tars, the serial nature of the backup, and the fact that the autochanger is dedicated to this job (it's its only purpose), are there any tuning parameters I can use to speed up the tape writes for this few-giant-files workload?
4) Is there anything about this that seems like a terrible, terrible idea?

Thanks,
Andrew