ok =) right. i have a very similar problem that needs the same solution. i need a backup solution with the following features:
- smallest possible network traffic
- smallest possible load on the remote machines
- preserved file permissions
- only a small amount of configuration on the remote machines
- ability to restore data from up to 7 days back (7-day retention)
- root login on the remote machines is disabled

is there any solution for this? =) i think there has to be a way like this: rdiff-backup over ssh, with the remote rdiff-backup run via sudo. but how? (rough sketches below, after the quoted mail.) thanks a lot

On Wed, 2006-03-22 at 14:16 +0100, Joachim Schipper wrote:
> On Wed, Mar 22, 2006 at 01:24:33PM +0100, Marco Fretz wrote:
> > hello
> >
> > i have a well-known problem, but no idea how to build a "correct"
> > solution.
> >
> > we have a lot of linux and bsd servers at our isp. i have to back up
> > data from these systems to a remote system.
> >
> > the backup server (storage server) has access to the remote systems
> > (data sources) over ssh with public key auth (a user named backup
> > exists on all systems).
> >
> > my problem: this user has no access to some files in /etc,
> > /usr/local/etc, and so on. so what to do?
> >
> > i built a script that runs locally on every system from a root
> > cronjob. the script backs up all data into a tar file on the local
> > machine.
> >
> > my problem now: i have to transfer the whole tar archive (on some
> > systems over 8 GB) to my backup server twice a day (backup period).
> >
> > is there a way with rdiff-backup to start rdiff-backup on the remote
> > machine with root access (via sudo)?
>
> Probably, but isn't the basis of the problem that the backup user has
> insufficient privileges?
>
> It might be a better solution to use public-key authentication for
> root, limited to executing a single command. The rdiff-backup docs
> seem to include this solution.
>
> 		Joachim
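
ps: here is roughly what i have in mind for the sudo variant. this is
only an untested sketch; the hostname web1.example.com, the target
directory /backup/web1/etc and the path /usr/bin/rdiff-backup are just
examples, adjust them for your machines.

    # on the backup server: pull /etc from a remote host, running the
    # remote end of rdiff-backup through sudo instead of as the backup user
    rdiff-backup \
        --remote-schema 'ssh -C %s sudo rdiff-backup --server' \
        backup@web1.example.com::/etc /backup/web1/etc

    # keep only 7 days of increments on the backup server
    rdiff-backup --remove-older-than 7D /backup/web1/etc

    # on the remote machine, in /etc/sudoers (edit with visudo): allow the
    # backup user to run exactly this one command as root, without a password
    backup ALL = NOPASSWD: /usr/bin/rdiff-backup --server
    # (if sudo complains about a missing tty, the "requiretty" default may
    # have to be relaxed for the backup user)

since rdiff-backup only sends the differences over the wire and keeps
file permissions in the repository, that should cover the traffic, load
and permission points.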
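
and the variant Joachim mentions (a root key restricted to a single
command) would look roughly like this, also untested; the key, the
from= address and the hostnames are made up. it does mean allowing root
login again, but only for that one forced command.

    # on each remote machine, in /root/.ssh/authorized_keys: the backup
    # server's key may only start rdiff-backup in server mode, nothing else
    command="rdiff-backup --server",from="192.0.2.10",no-port-forwarding,no-X11-forwarding,no-pty ssh-rsa AAAA... backup@storage

    # in /etc/ssh/sshd_config on the remote machine: root may log in only
    # with keys that carry a forced command
    PermitRootLogin forced-commands-only

    # on the backup server the default remote schema is then enough
    rdiff-backup root@web1.example.com::/etc /backup/web1/etc

newer rdiff-backup versions also seem to have a --restrict-read-only
option that could be added to the forced command so the key can only
read from the source machine.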