I run a script from root's crontab (not /etc/crontab) and keep the
login credentials in /root/.my.cnf so they don't have to be embedded
in the script. Note that $gzip is defined as /bin/cat because I move
copies offsite via rsync and disk space is abundant. This script keeps
30 daily backups (configurable).
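For reference, mysqldump reads credentials from the [client] section
of /root/.my.cnf, so a minimal file (user and password here are
placeholders) would be something like:

```ini
[client]
user     = root
password = YOURPASSWORD
```

Keep it mode 600 so only root can read the password.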
Crontab entry:
13 20 * * * cd /bak/databases && /root/db_backup
"db_backup" perl script:
#!/usr/bin/perl
use strict;
use warnings;

my $maxbackups = 30;                          # daily backups to keep: .0 .. .29
my $mysqldump  = '/usr/local/bin/mysqldump';
my $gzip       = '/bin/cat';                  # set to gzip to compress locally
my $filename   = 'all_databases.sql';

# Drop the oldest backup, then shift the rest up one slot:
# .28 -> .29, .27 -> .28, ..., .0 -> .1
my $oldest = "$filename." . ($maxbackups - 1);
unlink $oldest if -f $oldest;
for my $i (reverse 0 .. $maxbackups - 2) {
    my $curfile = "$filename.$i";
    my $newfile = "$filename." . ($i + 1);
    rename $curfile, $newfile if -f $curfile;
}

# Dump everything into slot 0; credentials come from /root/.my.cnf
my $curfile = "$filename.0";
my $command = "$mysqldump --opt --all-databases | $gzip > $curfile";
system($command) == 0 or warn "backup failed: $?";
On Sep 30, 2008, at 4:22 PM, John Almberg wrote:
DATE=`date +%a`
#
echo $DATE
#
echo Backup Mysql database
mysqldump -h localhost -u YOURSQLUSERID -pYOURPASSWORD YOURDATABASE \
    > /usr/somedirectory/somefile_$DATE.backup
gzip -f /usr/somedirectory/somefile_$DATE.backup
/usr/bin/at -f /usr/somedirectory/mysqlbackup.sh midnight
Ah, a much simpler solution than my Ruby script. I hadn't thought to
zip up the file before transferring it. That's an improvement I must
add.
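Something like the following sketch, I suppose. The paths and the
rsync target are hypothetical; the real rsync line is commented out
so the rest can be tried locally as-is:

```shell
# Minimal runnable sketch: make a dummy dump file, compress it,
# then (in real use) push only the .gz offsite.
DATE=$(date +%a)
BACKUP=$(mktemp -t somefile_$DATE.XXXXXX)      # stand-in for the real dump
echo "-- dump contents --" > "$BACKUP"
gzip -f "$BACKUP"                              # produces $BACKUP.gz
ls -l "$BACKUP.gz"
# rsync -az "$BACKUP.gz" backupuser@offsite.example.com:/bak/databases/
```

Note that rsync -z would also compress in transit, so gzipping first
mainly saves local disk and lets interrupted transfers restart from
a smaller file.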
Thanks: John
_______________________________________________
freebsd-questions@freebsd.org mailing list
http://lists.freebsd.org/mailman/listinfo/freebsd-questions
To unsubscribe, send any mail to "[EMAIL PROTECTED]"