Jorge,

Have you tried looking into the savevg or backup command?

Thanks
Vince

-----Original Message-----
From: ADSM: Dist Stor Manager [mailto:ADSM-L@VM.MARIST.EDU] On Behalf Of Jorge Amil
Sent: Thursday, February 02, 2012 10:06 AM
To: ADSM-L@VM.MARIST.EDU
Subject: EXTERNAL: Re: [ADSM-L] million files backup

Hi Jim, thank you very much for your answer.

Actually we are doing what you say: a .tar of the filesystem. It was a great solution when the filesystem was 500 GB-1 TB, but now our filesystem is 14 TB and the tar/gzip command takes 10-12 days... :(

So we need another approach.

Thanks
Jorge

> Date: Thu, 2 Feb 2012 08:47:24 -0600
> From: jschnei...@ussco.com
> Subject: Re: [ADSM-L] million files backup
> To: ADSM-L@VM.MARIST.EDU
>
> Jorge,
>
> On Unix systems:
> I've done it in two steps. Create a tar file of the file system and zip
> it. Create a second file listing all the files in the tarred directory.
> The tar extract command allows single files to be recalled if the
> absolute path name is available.
>
> I've used this to back up a Sterling Commerce flat-file database with
> multiple subdirectories holding more than 1.5 million files. The
> tar/gzip command took 5 or 6 hours for a 500 GB file system.
>
> I have not tried to do this on a Windows system.
>
> Jim Schneider
>
> -----Original Message-----
> From: ADSM: Dist Stor Manager [mailto:ADSM-L@vm.marist.edu] On Behalf Of
> Jorge Amil
> Sent: Thursday, February 02, 2012 8:30 AM
> To: ADSM-L@vm.marist.edu
> Subject: [ADSM-L] million files backup
>
> Hi everybody,
>
> Does anyone know the best way to back up a filesystem that
> contains millions of files?
>
> Backup image is not possible because it is a GPFS filesystem, and that is
> not supported.
>
> Thanks in advance
>
> Jorge
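For reference, Jim's two-step approach (tar+gzip the filesystem, then keep a listing of the archive so a single file can be restored by its exact path) can be sketched roughly as below. The paths and the tiny demo tree are placeholders for illustration only, not anything from the thread; a real run would point at the actual filesystem mount point.

```shell
#!/bin/sh
set -e
# Tiny demo tree so the sketch is runnable; in practice FS would be
# the real (multi-terabyte) filesystem mount point.
FS=/tmp/demo_fs
mkdir -p "$FS/sub"
echo "hello" > "$FS/sub/a.txt"

OUT=/tmp/demo_backup        # backup base name (placeholder)

# Step 1: tar the filesystem and compress it with gzip.
tar -czf "$OUT.tar.gz" -C "$(dirname "$FS")" "$(basename "$FS")"

# Step 2: save a listing of the archive's contents; this index is
# what lets a single file be found and restored later by its path.
tar -tzf "$OUT.tar.gz" > "$OUT.toc"

# Single-file restore, using a path taken from the index:
rm -rf "$FS"
tar -xzf "$OUT.tar.gz" -C /tmp demo_fs/sub/a.txt
cat "$FS/sub/a.txt"
```

Note this is exactly the scheme whose runtime grows with filesystem size, which is the problem Jorge describes at 14 TB.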