Brian,

We use a split approach. I have a 3583 fully populated with 72 LTO volumes holding the data from our 15 largest servers on collocated tapes. I have a 3575 fully populated with 180 3570 volumes holding the data from the remaining 120 "smaller" clients on non-collocated tapes. I also have a FILE DevClass disk pool holding folders.
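For reference, a split like that maps onto TSM storage-pool definitions roughly as follows. This is only a sketch: the pool and device-class names are made up, and the scratch limits are taken from the volume counts above. Collocation is a per-pool setting, so the "big servers" pool and the "small clients" pool simply carry different COLLOCATE values.

```
/* hypothetical names - collocated pool for the 15 largest servers */
DEFINE STGPOOL LTO_POOL LTO_CLASS COLLOCATE=YES MAXSCRATCH=72

/* non-collocated pool for the ~120 smaller clients */
DEFINE STGPOOL 3570_POOL 3570_CLASS COLLOCATE=NO MAXSCRATCH=180

/* disk pool backed by a FILE device class */
DEFINE DEVCLASS FILE_CLASS DEVTYPE=FILE DIRECTORY=/tsm/filepool MAXCAPACITY=2G
DEFINE STGPOOL FILE_POOL FILE_CLASS MAXSCRATCH=500
```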
That combination results in acceptably fast restores of files or entire folders on machines of any size. For very large restores we are largely bound by the NIC: our TSM server connects at 100 Mb/s, and one LTO drive can saturate the NIC.

Tab Trepagnier
TSM Administrator
Laitram LLC

brian welsh <[EMAIL PROTECTED]>
Sent by: "ADSM: Dist Stor Manager" <[EMAIL PROTECTED]>
02/19/2003 02:42 PM
Please respond to "ADSM: Dist Stor Manager"
To: [EMAIL PROTECTED]
cc:
Subject: Question about collocation on/off

Hello,

AIX 5.1, TSM 4.2.2.8, and a 3494 ATL with 700 volumes (about 620 in use) and about 300 nodes.

Three months ago we turned collocation off because about 200 of our client nodes store less than 3 GB of data each, and every node was using at least one tape, so we had a lot of tapes with low utilization and long migrations. Now tape utilization is much better and migration is much faster, but yesterday we restored almost 2 GB to a web server (WinNT), and it took 6 hours with almost 70 tape mounts. Before we turned collocation off, such restores took half an hour.

We are now thinking of setting collocation back on, but that implies buying and checking in about 350 new tapes, because MOVE DATA will take weeks or more.

I was wondering how other sites deal with collocation, and what they do to get better restore results when collocation is off. We are thinking of making a full backup once a month to get each client's data onto fewer tapes (or one), or putting collocation back on, or...

Curious to hear your reactions! Thanks, Brian.
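For what it's worth, turning collocation back on is a one-line change per storage pool, but it only affects tapes written from that point on; data already scattered across volumes stays scattered until it is reclaimed or explicitly moved, which is why the MOVE DATA estimate above runs to weeks. A sketch, assuming a hypothetical pool named TAPEPOOL:

```
/* new migrations and backups are collocated from here on */
UPDATE STGPOOL TAPEPOOL COLLOCATE=YES

/* existing volumes stay mixed until reclamation, or an explicit
   MOVE DATA per volume gradually regroups the old data */
MOVE DATA volume_name
```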