Mike McCarty wrote:
> David Fox wrote:
>> On Sun, Sep 14, 2008 at 1:16 PM, Mag Gam <[EMAIL PROTECTED]> wrote:
>>> So, what do you recommend for such an annoyance? rsync takes a while
>>> for me. But I rather I have 1 large tar file and untar as needed.
>> tar isn't the best tool to use for the job, especially if you need 1
>> file out of the tarball because it has to sequentially go through the
>> tar file until it hits the end.
> Or you hit ^C :-)
>> Your desired file might be at the beginning, but it could be towards
>> the end of the tarball.
> TAR was originally intended to work with tape drives, some of which
> are incapable of backward indexing, and must be rewound, so that's the
> way it behaves.
>> In any event, having to stat / open (other file i/o) on 30K files is
>> your bottleneck, and in cases like this, rsync would fare better,
>> since it doesn't have to copy all the files, only ones that have
>> changed. Of course, the first run will take more time.
> What tool do you recommend for his application? IOW... He wants a
> single largeish file/archive which is quickly searchable. Is cpio a
> better tool for this use?
> Mike
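To illustrate the point about sequential access: even when you ask tar for a single member, it still has to read through the archive from the start to locate it. A minimal sketch (file names and payloads are made up for the demo):

```shell
# Demo: extract one member from a tar archive.
# tar finds the member by scanning the archive sequentially; there is no index.
set -e
dir=$(mktemp -d)
cd "$dir"
mkdir data
for i in 1 2 3; do echo "payload $i" > "data/file$i.txt"; done
tar -cf archive.tar data      # pack the directory
rm -r data                    # remove originals
tar -xf archive.tar data/file3.txt   # pull back just one member
cat data/file3.txt
```

With 3 tiny files the scan is instant; with 30K files and a large archive, that linear scan is exactly the cost being discussed.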
No, cpio does the same thing: sequential access. It, too, was developed in the tape backup days, IIRC.
In any case, if random access is needed to files in the archive, I'm not sure which, if any, backup program would do this. Perhaps backup/restore?
But perhaps an iso or other disk image format would work better? The file would be mounted using the loop device, updated with rsync, and randomly accessed as needed. (One caveat: an iso9660 image is read-only once mastered, so for in-place rsync updates you'd want a writable filesystem image, e.g. ext2/ext3.)
The mount command would be:

  mount -t iso9660 -o loop /path/to/image/file.iso /mnt

Then use the files directly, or rsync files between /mnt and the "real" work area. -- Bob McGowan
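A sketch of the writable-image variant of this workflow (assumptions: e2fsprogs for mkfs, root privileges for the loop mount, and example paths; the privileged steps are shown as comments since they can't run unprivileged):

```shell
# Create a filesystem image that can be loop-mounted and updated in place.
set -e
img=$(mktemp -d)/work.img
dd if=/dev/zero of="$img" bs=1M count=4 2>/dev/null   # preallocate a 4 MB image
# The remaining steps need e2fsprogs and root:
# mkfs -t ext3 -F "$img"              # put a writable filesystem on the image
# mount -o loop "$img" /mnt           # attach it via the loop device
# rsync -a /real/work/area/ /mnt/     # incremental update into the image
# umount /mnt
ls -l "$img"
```

Once mounted, individual files inside the image are randomly accessible through the filesystem, which avoids tar's sequential scan entirely.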