Josh Tolbert wrote:
> On Thu, Mar 23, 2006 at 12:31:11PM -0800, Dennis Peterson wrote:
>> I'm testing your method now, but it's still running so I don't have an answer.
>>
>> dp
>
> Thanks for taking a look. I only have my one little OS X Server machine here,
> and I didn't set it up...
>
> Josh
It does as you say. You may get around it using a tool like tripwire to
limit your scan to the files of interest. Really, scanning every file on
a system disk is draconian. User space is another thing entirely, and
your own best practices should prevail. Here's an alternative to your
quandary:
tar cf - / | /usr/local/bin/clamscan -
Use your path to clamscan; the trailing "-" tells clamscan to read the
tar stream from stdin. Tar does not follow symlinks that are
directories, and this usage invokes a single instance of clamscan for
the entire scan. Tar also allows include and exclude files, which can
help you fine-tune your scope (man tar) - see the sketch below. If you
run tar as root then you get around all the messy permission problems.
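For example (a sketch only - this assumes GNU tar's -X syntax, and the
exclude file and patterns are made-up illustrations, not
recommendations):

# /tmp/scan-excludes holds one shell pattern per line, e.g.:
#   ./dev/*
#   ./private/var/vm/*
# -X reads those exclude patterns; -C / . keeps member names
# relative so the patterns above match
tar cf - -X /tmp/scan-excludes -C / . | /usr/local/bin/clamscan -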
This is processor-intensive regardless, especially when scanning gzip'd
tar files or other archives; tar itself is not much of a factor.

But this too is nuts. Do a global scan once to baseline things, then
scan only changed files thereafter (one way to do that is sketched
below). Build a new baseline every 45 days or so. When you work smart
your boss loves it and rewards you, and you are an instant babe magnet.
Better than owning a Harley. Ok - I made up that last part.
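One way to do the incremental piece (a sketch - the stamp file location
and the /Users path are arbitrary examples; it assumes find, xargs, and
clamscan are available):

# after the full baseline scan, drop a timestamp marker
touch /var/tmp/last-scan

# later runs: scan only files modified since the marker, then
# reset the marker for the next pass
find /Users -type f -newer /var/tmp/last-scan -print0 |
    xargs -0 /usr/local/bin/clamscan --infected
touch /var/tmp/last-scan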
Adding options to clamscan and to tar to ignore benign file types can be
a plus. If there's a problem then tripwire would be your first line
anyway. Not enough people use tripwire or cfengine which has a tripwire
like capacity and runs just fine in OS X. Far less overhead than virus
scanning the entire system over and over. Just remember to keep your
masters on a read-only CD or nfs mount.
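Clamscan's own filters look something like this (a sketch based on the
clamscan man page - the patterns and the 50M cap are made-up examples
of "benign" material you might skip):

# skip filenames and directories matching the given regexes,
# and skip any single file larger than 50 MB
/usr/local/bin/clamscan -r \
    --exclude='\.(jpg|png|mp3)$' \
    --exclude-dir='\.Trash' \
    --max-filesize=50M /Users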
dp ... Harley owner
_______________________________________________
http://lurker.clamav.net/list/clamav-users.html