I am writing a script in Perl to back up my system, and would like to
run a background process to burn each CD as it is created.

Right now I use this command:

my $fail = system "tar -vr -T $files_to_back -f $tar_file";

to create a tar file. If the tar file is bigger than 650 MB, I will
have to use split to break it into chunks. Needless to say, backing up
my whole hard drive would produce many chunks. In fact, if my hard
drive contains 10 GB of data, I would need 10 GB of extra space just to
run my script.
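
What I am picturing (an untested sketch; files.list and the
backup.tar.NNN names are just placeholders I made up) is to have tar
write the archive to stdout and slice the stream into CD-sized files in
Perl, so the whole archive never has to exist on disk at once:

use strict;
use warnings;

my $files_to_back = 'files.list';        # assumed: list of paths, one per line
my $chunk_size    = 650 * 1024 * 1024;   # one CD's worth of bytes

# tar writes the archive to stdout (-f -); Perl reads it from a pipe,
# so the full archive never sits on disk.
open my $tar, '-|', 'tar', '-cv', '-T', $files_to_back, '-f', '-'
    or die "can't start tar: $!";
binmode $tar;

my $chunk   = 0;
my $written = 0;
my $out     = new_chunk($chunk);

while (my $n = read $tar, my $buf, 64 * 1024) {
    my $off = 0;
    while ($off < $n) {
        my $room = $chunk_size - $written;
        my $take = ($n - $off < $room) ? $n - $off : $room;
        print {$out} substr($buf, $off, $take) or die "write failed: $!";
        $written += $take;
        $off     += $take;
        if ($written == $chunk_size) {   # chunk is full: start the next
            close $out or die "close failed: $!";
            $out     = new_chunk(++$chunk);
            $written = 0;
        }
    }
}
close $out or die "close failed: $!";
close $tar or die "tar exited abnormally: $?";

sub new_chunk {
    my ($n)  = @_;
    my $name = sprintf 'backup.tar.%03d', $n;
    open my $fh, '>', $name or die "can't open $name: $!";
    binmode $fh;
    return $fh;
}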

So I want to create a background process. (I believe that is what I
need, anyway.) I want tar to write 650 MB of data and then pause while
I create a disk image, burn the image, and then remove the image.
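
From reading around, I think fork is the piece I am missing. Something
like this untested sketch is what I have in mind, where burn_cd is a
stand-in for whatever mkisofs/cdrecord commands actually do the burning:

use strict;
use warnings;

# Fork a child to burn (and then delete) a finished chunk while the
# parent keeps writing the next one. burn_cd is a placeholder for the
# real image/burn commands.
sub burn_in_background {
    my ($chunk_file) = @_;
    defined(my $pid = fork) or die "fork failed: $!";
    if ($pid == 0) {                     # child: burn, then clean up
        my $fail = system "burn_cd $chunk_file";
        die "burn of $chunk_file failed\n" if $fail;
        unlink $chunk_file or warn "can't remove $chunk_file: $!";
        exit 0;
    }
    return $pid;                         # parent carries on at once
}

# Inside the chunking loop, after a chunk is closed:
# my $pid = burn_in_background($finished_chunk);
# ... keep tarring ...
# waitpid $pid, 0;   # one CD drive, so wait before the next burn

Since there is only one CD drive, I suppose the parent should waitpid
on the previous burn before forking the next one; tar would keep
running in the meantime, so the burning overlaps with the archiving.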

I have looked in the *Perl Cookbook*, but I couldn't find a recipe for
this.

I believe doing what I want is possible. There is a relatively simple
script called backuponcd that does just this. But that script is
written in bash, and I can't quite figure out what is going on in it.

Thanks

Paul

PS: I feel like I am reinventing the wheel. I am sure there are a
million good backup scripts and programs out there. But I either can't
get them to run, or they don't offer quite the ability to customize
that I want.

I would also like the ability to append new versions of files alongside
the old ones. For example, suppose I am working on a document called
"my_story.txt" and edit it every day for several weeks. I want each
day's version to be on a CD--in other words, there would be 21 copies
of this story if I edited it every day for three weeks. After all, I
might do some bad editing on day 18 and really wish that I still had
the copy of the story from day 15.
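
As far as I can tell, tar's append mode (-r), which my command above
already uses, gives this behavior for free: appending my_story.txt
again does not replace the earlier copy, it just adds another one at
the end of the archive. An untested daily run would look like:

use strict;
use warnings;

# Run once a day: each append keeps the previous copies intact.
my $fail = system 'tar -rvf backup.tar my_story.txt';
die 'append failed' if $fail;

# "tar -tvf backup.tar" then lists one my_story.txt per day, and GNU
# tar's --occurrence=N option can extract a particular day's copy.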


Anyone know of a *well-documented* Perl script that does what I want?

