Charlie Reiman wrote:
> Just to add a random thought here:
>
> IIRC gzip has an internal checksum and can be validated quickly. You can,
> if you have lots of time and this is life & death, essentially use a brute
> force attack on your candidate sector lists. For example, if the file
> should be 10 clusters long and you have 25 candidates, it should be easy
> to generate the (uh... 25 choose 10...) 3268760 candidate lists, then test
> the checksum on each one (essentially "gzip -t testfile || echo 'nope,
> next!'"). This would involve script fu, use of dd, and a heavy amount of
> disk I/O, but it isn't rocket science. This would be a last-resort effort,
> since it could take a long time and the most obvious linear candidates
> will probably be correct.
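For what it's worth, the brute-force idea can be sketched in a few lines of Python. This is just an illustration under assumptions, not a tested recovery tool: it assumes the candidate clusters have already been dumped with dd and are listed in on-disk order (so combinations, not permutations, are enough), and the name recover_gzip is made up. It also pre-filters on the gzip magic bytes (1f 8b) so most bad orderings are rejected without a decompression attempt:

```python
import gzip
import itertools
import random
import zlib

def recover_gzip(candidates, n_clusters):
    """Try every ordered combination of n_clusters candidate clusters
    (candidates assumed to be in on-disk order) and return the first
    concatenation that passes gzip's internal CRC32 check."""
    for combo in itertools.combinations(candidates, n_clusters):
        if combo[0][:2] != b"\x1f\x8b":  # first cluster must start with gzip magic
            continue
        try:
            # gzip.decompress verifies the trailing CRC32 and length fields
            return gzip.decompress(b"".join(combo))
        except (OSError, EOFError, zlib.error):
            continue  # checksum or format mismatch: "nope, next!"
    return None

# Tiny self-contained demonstration with synthetic "clusters":
CLUSTER = 512
rng = random.Random(42)
original = rng.randbytes(2000)                  # incompressible payload
gz = gzip.compress(original)
real = [gz[i:i + CLUSTER] for i in range(0, len(gz), CLUSTER)]
decoys = [rng.randbytes(CLUSTER) for _ in range(3)]
# decoys interleaved; real clusters stay in relative order
candidates = [decoys[0], real[0], decoys[1]] + real[1:] + [decoys[2]]
recovered = recover_gzip(candidates, len(real))
```

One caveat on real data: the final cluster normally carries slack bytes past the true end of file, which would make the CRC check fail, so you would truncate each joined blob to the exact file size from the directory entry before testing.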
Interesting thought. But he said the file was a "few megs", and the largest
FAT32 cluster size is 32k (64 512-byte sectors). Let's say the file is 2 MB,
which is probably a low estimate for a "few megs". 2 MB / 32k = 64 clusters.
And that's the best case; depending on the size of the partition, he may
have 16k, 8k, or even 4k clusters, possibly increasing the cluster count by
up to 8 times (2 MB / 4k = 512 clusters). With 64 clusters and even a
handful of stray candidates, the number of combinations to test dwarfs the
3.3 million in your example. And the file, again, may be much larger than
the example I'm using here.

> You should also check the file format for gzip. The first sector will have
> the magic number and possibly the length. There might also be some
> recognizable magic in the last sector as well.

The first cluster is easy to find; the directory entry specifies it. But
it's worth checking for magic numbers or whatever just to be sure it wasn't
overwritten by something else after being deleted. If you're missing the
first sector of a single gzipped file, you can probably forget about
recovering it, I would guess.

Craig