On 20.06.2016 at 22:01, Larry Irwin (gmail) wrote:
The scripts I use analyze the rsync log after it completes and then sftp a summary to the root of the just-completed rsync. If no summary is found, or the summary says it failed, the folder rotation for that set is skipped and that folder is re-used on the subsequent rsync. The key here is that the folder rotation script runs separately from the rsync script(s). For each entity I want to rsync, I create a named folder to identify it, and the rsync'd data is held in sub-folders:
daily.[1-7] and monthly.[1-3]
When I rsync, I rsync into daily.0 using daily.1 as the link-dest.
Then the rotation script checks daily.0/rsync.summary - and if it worked, it removes daily.7 and renames the daily folders. On the first of the month, the rotation script removes monthly.3, renames the other two, and makes a complete hard-link copy of daily.1 to monthly.1. It's been running now for about 4 years and, in my environment, the 10 copies take about 4 times the space of a single copy.
(we do complete copies of linux servers - starting from /)
If there's a good spot to post the scripts, I'd be glad to put them up.

Hi Larry,

That is something I couldn't do with my current scripting skills, but it sounds very interesting, and I would really like to know how you did it, if you don't mind showing me your script of course.
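Just from your description, I would guess the daily rotation looks roughly like this. The daily.N / monthly.N names are yours; the function name, paths and everything else here is only my assumption, not your actual script:

```shell
#!/bin/sh
# Guessed sketch of the daily rotation: drop the oldest copy, shift the
# rest up by one, then promote the freshly completed daily.0.

rotate_daily() {
        base=$1
        rm -rf "$base/daily.7"
        i=7
        while [ "$i" -gt 1 ]; do
                prev=$((i - 1))
                [ -d "$base/daily.$prev" ] && mv "$base/daily.$prev" "$base/daily.$i"
                i=$prev
        done
        # promote the just-finished rsync target
        [ -d "$base/daily.0" ] && mv "$base/daily.0" "$base/daily.1"
}

# On the first of the month you would presumably do something similar for
# monthly.[1-3], plus a hard-link copy of the newest daily, e.g.:
#   cp -al "$base/daily.1" "$base/monthly.1"
```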
As for my script, this is what i came up with.

#!/bin/sh

# rsync copy script v2: pull from FreeNAS to BackupNAS

# Set Date
B_DATE=$(date +"%d-%m-%Y-%H%M")
EXPIRED=$(date +"%d-%m-%Y" -d "14 days ago")

# Create the working directory if it doesn't exist already
# (mkdir -p is a no-op when it does, so no test is needed)
mkdir -p /volume1/Backup_Test/in_progress

# rsync command: use --link-dest only if a previous successful run was recorded
if [ -f /volume1/rsync/Test/linkdest.txt ] ; then
        rsync -aqzh \
        --delete --stats --exclude-from=/volume1/rsync/Test/exclude.txt \
        --log-file=/volume1/Backup_Test/logs/rsync-$B_DATE.log \
        --link-dest=/volume1/Backup_Test/$(cat /volume1/rsync/Test/linkdest.txt) \
        Test@192.168.2.2::Test /volume1/Backup_Test/in_progress
else
        rsync -aqzh \
        --delete --stats --exclude-from=/volume1/rsync/Test/exclude.txt \
        --log-file=/volume1/Backup_Test/logs/rsync-$B_DATE.log \
        Test@192.168.2.2::Test /volume1/Backup_Test/in_progress
fi

# Check the rsync return value (24 = source files vanished during transfer,
# which is still a usable backup)
RC=$?
if [ "$RC" -eq 0 ] || [ "$RC" -eq 24 ] ; then
        mv /volume1/Backup_Test/in_progress /volume1/Backup_Test/$B_DATE
        echo $B_DATE > /volume1/rsync/Test/linkdest.txt
fi

# Delete expired snapshots (2 weeks old); loop over the glob, because
# [ -d ...-* ] breaks as soon as more than one snapshot matches that day
for dir in /volume1/Backup_Test/$EXPIRED-* ; do
        [ -d "$dir" ] && rm -rf "$dir"
done

Keep in mind I am not very good at this, and if something can be improved or you see a big flaw in it, I would be grateful if you let me know. So far it seems to do the trick. I would like to improve it so that the logfile is mailed to a specific e-mail address after rsync completes successfully. Unfortunately the logfiles grow very big when I have lots of data to back up, and I couldn't figure out how to send only a specific part of the logfile or to customize the logfile somehow.
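My rough idea so far was to mail only the end of the log, since with --stats the transfer totals land in the last few lines, but I haven't gotten it working. This is just a sketch; whether a mail command exists on the NAS, and the address and path below, are all assumptions:

```shell
#!/bin/sh
# Sketch: extract only the tail of an rsync log (where the --stats totals
# are written) so the mail stays small. Untested on the NAS; the mail
# command, address and log path in the usage example are made up.

log_summary() {
        # the last 25 lines should be enough to cover rsync's --stats block
        tail -n 25 "$1"
}

# Hypothetical usage after a successful run:
#   log_summary /volume1/Backup_Test/logs/rsync-$B_DATE.log \
#           | mail -s "Backup_Test rsync $B_DATE" admin@example.com
```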



--
Please use reply-all for most replies to avoid omitting the mailing list.
To unsubscribe or change options: https://lists.samba.org/mailman/listinfo/rsync
Before posting, read: http://www.catb.org/~esr/faqs/smart-questions.html