I'm having the same problem as everyone else. Our imaging system will be at 30 million+ files across 3 disks and 2 TB by the end of the year. I was going to attempt a snapshot image backup of each filesystem, which is on a Win2k server. Has anyone tried this before on a server of this size, and were you successful? I'm not very confident about doing this on the Windows platform.
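For reference, this is roughly what I had in mind with the TSM Backup-Archive client. It's only a sketch: the drive letters and the IMAGEMGMT management class are placeholders for our setup, and whether these options behave this way on a Win2k client at our level is something I still need to verify against the docs.

    * dsm.opt - volumes that should be handled as image backups
    DOMAIN.IMAGE d: e: f:
    * bind the image objects to a management class (placeholder name)
    INCLUDE.IMAGE d: IMAGEMGMT
    INCLUDE.IMAGE e: IMAGEMGMT
    INCLUDE.IMAGE f: IMAGEMGMT

    * full image backup of each volume, run from a scheduler command script
    dsmc backup image d: -mode=selective
    dsmc backup image e: -mode=selective
    dsmc backup image f: -mode=selective

    * disaster recovery of a whole volume in one piece
    dsmc restore image d:

The attraction is that a volume-level image sidesteps walking 30 million file entries, but it also means I lose easy single-file restores unless I keep some form of incremental running alongside it.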
I understand the problems associated with backing up a server with millions of files and a badly written application that stores everything in one place on one drive, but I need a solution.

-----Original Message-----
From: ADSM: Dist Stor Manager [mailto:[EMAIL PROTECTED] On Behalf Of Ted Byrne
Sent: Saturday, June 18, 2005 4:57 PM
To: ADSM-L@VM.MARIST.EDU
Subject: Re: 15,000,000 + files on one directory backup

I would second Bill's addition of poorly-architected applications to Richard's list of issues that should be (but often are not) addressed, or even considered.

At another customer, we and the customer's sysadmins were bedeviled by a weblog analysis application (which shall remain nameless) that chose to store its data on the filesystem, using the date of the log data as a directory under which the data was stored (as well as the associated reports, I believe). The explanation we were given was that they had chosen to do this for application performance reasons; it was apparently quicker than using a DBMS. This decision, although it made random access of data quicker, had horrible implications for backup as the log data and reports accumulated over time; recovery was even worse. Aggravating the situation was the insistence by the "application owner" that ALL historical log data absolutely had to be maintained in this inside-out database format.

Just getting a count of files and directories on this drive (via selecting Properties from the context menu) took something on the order of 9 hours to complete. The volume of data, in GB, was really not that large - something on the order of 100 GB. All of their problems managing the data stemmed entirely from the large number of files and directories.

When the time came to replace the server hardware and upgrade the application, they had extreme difficulty migrating the historical data from the old server to the new. They finally succeeded in copying the data over, but it took days and days of around-the-clock network traffic to complete.

Addressing the ramifications of this type of design decision after the fact is difficult at best. If at all possible, we need to prevent it from occurring in the first place.

Ted