Josh Tolbert wrote:
> On Thu, Mar 23, 2006 at 09:07:18PM -0800, Dennis Peterson wrote:
>> It does as you say. You may get around it by using a tool like Tripwire
>> to limit your scan to the files of interest. Really, scanning every
>> file on a system disk is draconian. User space is another thing
>> entirely, and your own best practices should prevail. Here's an
>> alternative to your quandary:
>>
>>     tar cf - / | /usr/local/bin/clamscan -
>> Use your own path to clamscan; the trailing "-" tells clamscan to read
>> the tar stream from stdin. Tar does not follow symlinks that point to
>> directories, and this usage invokes a single instance of clamscan for
>> the entire scan. Tar also supports include and exclude files, which can
>> help you fine-tune your scope (man tar); see the sketch below. If you
>> run tar as root then you get around all the messy permission problems.
>> This is processor-intensive regardless, especially when scanning
>> gzipped tar files or other archives; tar itself is not much of a
>> factor. But this too is nuts. Do a global scan once to baseline things,
>> then scan only changed files thereafter. Build a new baseline every 45
>> days or so. When you work smart your boss loves it and rewards you, and
>> you are an instant babe magnet. Better than owning a Harley. Ok - I
>> made up that last part.
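>> A sketch of both ideas (the exclude-file and timestamp-file names are
>> made up; check man tar for your tar's exclude syntax, and mind
>> clamscan's archive limits, --max-files and friends, on a stream that
>> big):
>>
>>     # full baseline scan, skipping paths listed in an exclude file
>>     tar cf - -X /usr/local/etc/scan-excludes / | /usr/local/bin/clamscan -
>>     touch /var/run/last-scan
>>
>>     # thereafter, scan only files changed since the baseline
>>     find / -type f -newer /var/run/last-scan -print0 \
>>         | xargs -0 /usr/local/bin/clamscan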
>> Adding options to clamscan and to tar to ignore benign file types can
>> be a plus. If there's a problem then Tripwire would be your first line
>> anyway. Not enough people use Tripwire, or cfengine, which has a
>> Tripwire-like capability and runs just fine on OS X. Far less overhead
>> than virus-scanning the entire system over and over. Just remember to
>> keep your masters on a read-only CD or NFS mount.
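>> For reference, Open Source Tripwire's basic cycle is just two commands
>> (policy and config setup omitted):
>>
>>     tripwire -m i    # --init: snapshot files into a baseline database
>>     tripwire -m c    # --check: report anything that changed since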
>> dp ... Harley owner
> Hey dp,
>
> I appreciate the reply and the verification.
>
> I would love to do more intelligent scanning... Unfortunately, my boss
> doesn't see things that way, and he constantly comes back with, "It
> worked fine with Virex!" He wants a full scan.
He seems like a *very* smart man.
> I guess I should explain how things worked before. Nightly, /Users gets
> scanned, and that seems to work just perfectly with clamscan. The
> weekly scans, however, were done on / with /Users excluded, since it
> had already been scanned earlier that day. This is the same thing Virex
> was doing, and my boss doesn't want to move away from that.
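> Roughly, in clamscan terms (sketched from memory; check man clamscan
> for the exact --exclude-dir syntax):
>
>     # nightly
>     clamscan -r /Users
>
>     # weekly
>     clamscan -r --exclude-dir=/Users /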
Then at least run df -l to see which file systems are worth bothering
with. No point scanning /proc, for example. Do scan /tmp, though.
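Something like this covers it (untested sketch; the awk pulls the mount
point from the last column of df's output, so it assumes mount points
without spaces):

    # scan each local file system, staying on that mount (-xdev)
    for fs in $(df -l | awk 'NR > 1 {print $NF}'); do
        find "$fs" -xdev -type f -print0 | xargs -0 /usr/local/bin/clamscan
    done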
> I had debated using find(1) to do a similar thing (find all real files,
> pass them to clamscan, etc.), but the tar method would probably be
> better. I'd be interested to see how the reporting works, though. I'll
> give it a shot tomorrow.
That would invoke a new instance of clamscan for each file found, no?
You'll probably also want to avoid scanning Unix special files -
devices, named pipes, etc.
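If you do go the find route, -type f skips the special files, and the
POSIX "+" terminator (or a pipe to xargs) batches the names into a
handful of clamscan invocations instead of one per file:

    find / -type f -exec /usr/local/bin/clamscan {} +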
I"m curious what's causing clamscan to loop infinitely, though. Also, what's
with the odd double slash at the beginning of each path?
I didn't look that deeply into the issue (it's a method I'd never use).
Not sure why the // shows up as it does; most likely just a path being
joined onto the trailing slash of the scan root, which is harmless. I've
seen this a lot in OS X. That and case-insensitive file systems. There
are some things not to like about the Unix in OS X. To be fair, though,
this Mac Mini I'm working with came prebuilt and does not have a UFS
file system installed.
dp
_______________________________________________
http://lurker.clamav.net/list/clamav-users.html