Good morning
the problem is tied to the maximum size of the backup file on the filesystem.
On ext3fs the maximum size of a single file is 2.50.
So here is the limitation: when the directory contains more data (subdirectories, files, whatever one wants) than that maximum size, my backup doesn't contain all the files of the directory. So I need to divide the directory contents and back up one part first, then the next, one part at a time.
The three modules used in the script allow me to run it with multiple processes to speed up the backup time, but that's another story!
How can I handle my big dir in smaller, lighter portions?
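For example, here is a minimal sketch of the kind of splitting I mean (an illustration only: File::Find and a cap matching the -s 2000m used in the script below are assumptions). It walks the tree and groups files into size-bounded batches, each of which could then go to its own archive:

use strict;
use warnings;
use File::Find;

my $batch_cap  = 2000 * 1024 * 1024;   # ~2000 MB per batch (assumed cap)
my $batch_size = 0;
my (@batch, @batches);

find(sub {
    return unless -f $_;               # regular files only
    my $size = -s _;                   # reuse the stat from the -f test
    if (@batch && $batch_size + $size > $batch_cap) {
        push @batches, [@batch];       # current batch is full, start a new one
        @batch      = ();
        $batch_size = 0;
    }
    push @batch, $File::Find::name;    # full path of this file
    $batch_size += $size;
}, '/home/user');
push @batches, [@batch] if @batch;     # keep the last partial batch

# Each batch could then be fed to its own archive (Testbackup_1.cpio,
# Testbackup_2.cpio, ...) so that no single file hits the limit.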
Thanks in advance.


Goffredo Saffioti


#!/usr/bin/perl
use strict;
use warnings;

use IPC::Open2;
use Term::ANSIColor;
use POSIX ":sys_wait_h";



my $MOUNT_DIR = "/mnt/test";
my $BACKUP_FILE = "Testbackup.cpio";
my $FULL_LOG_FILE = "Testlog.log";


my $backup_dir     = '/home/user';
my $running_output = '/tmp/im_running.txt';


open(my $OUTPUT, '>>', $running_output) or die "Can't open output file: $!";


opendir(my $DIRHANDLE, $backup_dir) or die "can't opendir $backup_dir: $!";
while (my $file = readdir($DIRHANDLE)) {
    next if $file eq '.' || $file eq '..';

    # readdir() returns names relative to $backup_dir, so build the full
    # path before testing it or handing it to find.
    my $path = "$backup_dir/$file";

    unless (-d $path) {
        print $OUTPUT "Skipping non-directory: $file\n";
        next;
    }
    unless (-r $path) {
        print $OUTPUT "Skipping unreadable directory: $file\n";
        next;
    }


print $OUTPUT "Backing up starting at $file...\n";



    # Archive from the full path; quote it for the shell in case of spaces.
    my @find_output = `/usr/bin/find "$path" -depth -mount -xdev -print | /usr/bin/afio -oulvAZ -T3k -s 2000m -L$MOUNT_DIR/$FULL_LOG_FILE $MOUNT_DIR/$BACKUP_FILE`;
    # Check the pipeline's exit status before calling this directory done.
    if ($? != 0) {
        print $OUTPUT "afio pipeline for $file exited with status $?\n";
        next;
    }



print $OUTPUT "Back up of $file complete.\n"; } closedir $DIRHANDLE;


print $OUTPUT "I am outta here....\n";



close($OUTPUT);
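For the multi-process part mentioned above (the reason POSIX ":sys_wait_h" is loaded), a minimal sketch could fork one child per subdirectory, with an assumed cap of four concurrent children and a hypothetical backup_one_dir() that wraps the find | afio pipeline:

use POSIX ":sys_wait_h";

my $MAX_CHILDREN = 4;        # assumed concurrency cap
my %children;

foreach my $dir (@dirs) {    # @dirs: the subdirectories collected above (assumed)
    # Throttle: block until a child exits if we are at the limit.
    while (keys %children >= $MAX_CHILDREN) {
        my $done = waitpid(-1, 0);
        delete $children{$done} if $done > 0;
    }
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        backup_one_dir($dir);    # hypothetical: runs the find | afio pipeline
        exit 0;
    }
    $children{$pid} = $dir;      # parent: remember the running child
}
# Reap whatever is still running before exiting.
1 while waitpid(-1, 0) > 0;

Note that each child would need its own archive and log file; several afio processes appending to the same Testbackup.cpio at once would interleave their output.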







