Thanks for the suggestion. We have tried it. Same results. Things just go to sleep!
"Mark D. Rodriguez" <[EMAIL PROTECTED]>
Sent by: "ADSM: Dist Stor Manager" <ADSM-L@VM.MARIST.EDU>
03/28/2005 05:30 PM
Please respond to "ADSM: Dist Stor Manager" <ADSM-L@VM.MARIST.EDU>
To: ADSM-L@VM.MARIST.EDU
cc:
Subject: Re: [ADSM-L] Large Linux clients

Zoltan,

I am not sure whether this will fix the problem or not. I have seen in the past that, when trying to back up directories (including subdirectories) with a large number of files, the system runs out of memory and either fails or hangs forever. One thing I have done that has worked in some cases is to use the MEMORYEFficientbackup option. It is a client-side option and can be placed in the option file or passed on the command line. I would try it and see if it helps. BTW, there is a downside: backups will be slower. However, slow is still faster than not at all!

Let us know if that helps.

--
Regards,

Mark D. Rodriguez
President, MDR Consulting, Inc.

===============================================================================
MDR Consulting
The very best in Technical Training and Consulting.
IBM Advanced Business Partner
SAIR Linux and GNU Authorized Center for Education
IBM Certified Advanced Technical Expert, CATE
AIX Support and Performance Tuning, RS6000 SP, TSM/ADSM and Linux
Red Hat Certified Engineer, RHCE
===============================================================================

Zoltan Forray/AC/VCU wrote:

>I am having issues backing up a large Linux server (client = 5.2.3.0).
>
>The TSM server is also on a RH Linux box (5.2.2.5).
>
>This system has over 4.6M objects.
>
>A standard incremental WILL NOT complete successfully. It usually
>hangs/times-out/etc.
>
>The trouble seems to be related to one particular directory with
>40 subdirs, comprising 1.4M objects (from the box owner).
>
>If I point to this directory as a whole (via the web ba-client) and try
>to back it up in one shot, it displays the "inspecting objects" message
>and then never comes back.
>
>If I drill down further and select the subdirs in groups of 10, it seems
>to back them up with no problem.
>
>So, one question I have: is anyone out there backing up large Linux
>systems similar to this?
>
>Any suggestions on what the problem could be?
>
>Currently, I do not have access to the error-log files, since this is a
>protected/firewalled system and I don't have the id/pw.
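For anyone finding this thread later: the MEMORYEFficientbackup option Mark mentions can be set either in the client option file or per-invocation on the `dsmc` command line, and the "subdirs in groups" workaround Zoltan describes can be approximated by running one incremental per subdirectory. A minimal sketch only; `/data` is a hypothetical stand-in for the problem directory, and you should verify the option spelling and supported values against your client level (e.g. with `dsmc query options`):

```shell
# In the client option file (dsm.opt, or dsm.sys on Unix clients):
#   MEMORYEFFICIENTBACKUP YES

# Or pass it for a single run against the problem directory:
dsmc incremental /data/ -subdir=yes -memoryefficientbackup=yes

# Workaround in the spirit of selecting subdirs in groups: one
# incremental per subdirectory, which bounds the number of objects
# the client has to inspect (and hold in memory) at once.
for d in /data/*/; do
    dsmc incremental "$d" -subdir=yes -memoryefficientbackup=yes
done
```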