Hi all,
    First, thanks for the responses, and sorry for being so vague.  Let
me explain what I have done and why.
    I wrote a backup script that logs progress, errors, and any other
useful information under the sun to a MySQL database.

    The script runs from a central backup server and backs up several
servers.  When the script starts it forks (right now I have it set to
5) and does 5 servers at a time.  When one finishes it grabs the next
one and continues as such.
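
   In case it helps anyone, the dispatch loop is shaped roughly like
this (a simplified sketch, not my exact code; the host names and the
backup_server() routine are placeholders):

    #!/usr/bin/perl -w
    use strict;

    my @servers  = ('web1', 'web2', 'web3');   # placeholder host list
    my $max_kids = 5;                          # servers at a time
    my $running  = 0;

    sub backup_server {
        my ($host) = @_;
        # stand-in for the real per-server backup work
        print "backing up $host\n";
    }

    while (@servers or $running) {
        # Top the pool up to $max_kids children.
        while ($running < $max_kids and @servers) {
            my $host = shift @servers;
            my $pid  = fork();
            die "fork failed: $!\n" unless defined $pid;
            if ($pid == 0) {
                backup_server($host);
                exit 0;
            }
            $running++;
        }
        # Block until one child finishes, then grab the next server.
        wait();
        $running--;
    }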

   What I was running into is this: when backing up a server, I was
tarring each individual user to standard out, then writing that stream
to a gzwrite object on the backup server.  This way the compression was
not done on the client box, and there was no disk space requirement on
the client box either.  I did not want to tar on the client box and scp
the tarball over; there are too many downfalls in that.
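
   For the curious, the per-user transfer is basically this (a trimmed-
down sketch, assuming Compress::Zlib and passwordless ssh; the host
name and paths here are made up):

    use strict;
    use Compress::Zlib;

    my ($host, $user) = ('client1', 'someuser');   # placeholders

    # The gzip archive lives on the backup server; nothing is written
    # on the client box.
    my $gz = gzopen("/backups/$host/$user.tar.gz", "wb")
        or die "gzopen failed: $gzerrno\n";

    # tar runs on the client and streams the raw archive back over ssh.
    open(TAR, "ssh $host tar -cf - /home/$user |")
        or die "could not start remote tar: $!\n";
    binmode(TAR);

    # Read in fixed-size chunks so the whole tarball never has to sit
    # in memory at once.
    my $buf;
    while (read(TAR, $buf, 65536)) {
        $gz->gzwrite($buf)
            or die "gzwrite failed: " . $gz->gzerror . "\n";
    }
    close(TAR) or warn "remote tar exited abnormally: $?\n";
    $gz->gzclose();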

   But if I tarred 5 large users from 5 different servers at once, I
ran out of memory on the backup server and crashed it.  I have found
that once perl allocates memory, it does not release it until the
script dies.  That is not good.

  So what I did was spawn a child each time I went to tar a user from a
client server.  When the user was finished tarring, the child would
exit and its memory would be released, so that other processes on the
box could use it.  That almost eliminates the memory issue.  The only
problem now is if I have to back up a user on a client box who is
larger than the memory on the backup server.  But I have a few ideas
for that.

   So the issue I was having when I posted was that a child for every
$item in the @array was being spawned at once.  I threw a while loop in
there and fixed that.  I knew it was something stupid I was
overlooking.
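
   For the archives, the fixed loop comes out roughly like this: the
same per-user fork described above, with the parent waiting so only one
child runs at a time (I actually used a while loop, but waitpid does
the same job; process_item() is a stand-in for the tar/gzwrite work):

    foreach my $item (@list) {
        FORK: {
            my $pid = fork();
            if ($pid) {
                # Parent: block until this child is done before
                # moving on to the next $item.
                waitpid($pid, 0);
            }
            elsif (defined $pid) {
                # Child: do the work for this $item, then exit so
                # the memory it grabbed goes back to the system.
                process_item($item);
                exit 0;
            }
            elsif ($! =~ /No more process/) {
                # fork failed temporarily; wait a bit and retry.
                sleep 5;
                redo FORK;
            }
            else {
                die "cannot fork: $!\n";
            }
        }
    }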

   Sorry for not explaining everything thoroughly in my first email.  I
will make sure any future posts are more descriptive.

   Thanks for all the help.  I greatly appreciate it.

Sincerely,
Chad 





On Wed, 11 Sep 2002 11:29:34 -0800
Michael Fowler <[EMAIL PROTECTED]> wrote:

> On Wed, Sep 11, 2002 at 12:17:55PM -0400, Chad Kellerman wrote:
> > my @list = "bla, bla, bla, bla, bla";
> 
> You probably meant @list = ("bla", "bla", "bla", "bla", "bla");
> 
> 
> > foreach my $item(@list) {
> >         my $pid;
> >         FORK: {
> >             if ($pid=fork) {
> >                    print"$pid\n";
> >             }elsif (defined $pid){
> >              #do other perl stuff to $item
> >             exit 0;
> >             } elsif ($! =~/No more process/) {
> >             sleep 5;
> >             redo FORK;
> >             }
> >        } 
> > }
> > 
> >    What I want to do is have the fork process $item before it goes
> > on to the next $item.  But as it is written, it forks every $item
> > in the @list.
> > 
> >    Just fork, finish, repeat until all $item are done.
> 
> What is the point of forking if you're waiting for the sub-process to
> complete before continuing on?
> 
> The solution, of course, is to wait, see perldoc -f wait and perldoc
> -f waitpid.  Another solution is to remove the fork altogether, and
> simply process the item in the parent.
> 
>  
> Michael
> --
> Administrator                      www.shoebox.net
> Programmer, System Administrator   www.gallanttech.com
> --
> 
