On Jan 24, 2006, at 2:22 PM, Jeff wrote:
This example that Francesco illustrates seems to work pretty well. I
guess my main concern was with tar - would it be able to handle a
filesystem this large? Myself, I haven't seen or heard any scary
stories thus far. Can anyone shed light on tar's limitations?
All of tar's limitations have to do with the output file. Typically
you'll run into problems at 2 gigs on some old kernels or *nix
variants. tar used to have a limit of around 8 gigs, assuming the
underlying kernel/filesystem would allow it, but I haven't tried to
push that limit in quite a while. For instance, 4 gigs is where you
crash if writing to a FAT32 partition.
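If those single-file limits are a worry, one common workaround (a sketch, not from the thread; the chunk size and file names are made up) is to stream the archive through split so no one piece crosses the limit:

```shell
# Sketch only: the 1900m chunk size is illustrative, chosen to stay
# under the 2-gig limit mentioned above.
tar -zcf - /var/backup | split -b 1900m - backup.tgz.part-

# To restore, concatenate the parts in shell-glob order and untar;
# listing the contents first is a cheap sanity check.
cat backup.tgz.part-* | tar -tzf -
```

split appends aa, ab, ac... suffixes, so the glob reassembles the pieces in the right order.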
Thanks for all the colorful replies.
:-)
-Jeff
Francesco Riosa wrote:
Jeff wrote:
Hey guys.
I've got this big fat backup server with no space left on the
hard drive
to store a tar file. I'd like to pipe a tar through ssh, but not
sure
what the command would be. Something to the effect of:
# cat /var/backup | ssh backup.homelan.com 'tar data.info.gz'
So that, the data is actually being sent over ssh, and then
archived on
the destination machine.
tar -zcf - /var/backup | ssh backup.homelan.com "( cat > data.info.gz )"
or something similar; it's probably possible to avoid the use of cat,
but it doesn't come to mind at the moment
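Two ways the cat could be dropped (sketches only; backup.homelan.com and /var/backup come from the thread, while jeffs.box and /restore are made-up names):

```shell
# Run the pipe in the other direction: ssh's stdout is the archive
# stream, and the redirection happens locally on the backup host.
# (jeffs.box is a hypothetical name for the source machine.)
ssh jeffs.box 'tar -zcf - /var/backup' > data.info.gz

# Or skip the intermediate tarball entirely and unpack as the data
# arrives (/restore is a made-up destination directory):
tar -zcf - /var/backup | ssh backup.homelan.com 'tar -xzf - -C /restore'
```

The second form also sidesteps the large-file limits discussed above, since no single archive file is ever written to disk.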
--
Officer:
We've analyzed their attack, sir, and there is a danger.
Should I have your ship standing by?
Governor Tarkin:
Evacuate? In our moment of triumph? I think you
overestimate their chances.
--
gentoo-user@gentoo.org mailing list