"G.W. Haywood via clamav-users" <clamav-users@lists.clamav.net> wrote:

> There are ways around that, even if you don't want to run clamdscan
> (and clamd) as root - which I'd entirely understand.
Is --fdpass one of them?  And --stream?  Any others?

> >We've got about 3000 Linux systems that we'd like to periodically scan,
> >primarily to ensure that they're not being used to redistribute
> >Windows malware.
>
> A good use case, perhaps quite a tall order with a single clamd server
> but maybe doable if you can (a) limit what needs to be scanned and (b)
> define 'periodically' in terms of days (at least) and not hours.

We could use multiple clamd servers, and "periodically" in terms of days
would be OK.

> >Any attempt to skip "junk" will potentially skip malware, and hand
> >crafting scans for each system is not an option.
>
> That seems more like a management problem to me than a technical one.

The nature of our environment precludes actively managing all of the
Linux systems here, unfortunately.

> >Skipping multiple copies of the same file won't really help because
> >the duplication is across systems, and because every file will be
> >rescanned every time clamscan is run.
>
> That's not true of clamdscan.

Hmm... that's promising.  I'll give it a try.

> And you probably won't know what's been modified in the past week unless
> you install Tripwire or something like that...

mtime would be sufficient for our purposes.

-Dave

_______________________________________________
clamav-users mailing list
clamav-users@lists.clamav.net
https://lists.clamav.net/mailman/listinfo/clamav-users

Help us build a comprehensive ClamAV guide:
https://github.com/vrtadmin/clamav-faq

http://www.clamav.net/contact.html#ml
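For the archives, a minimal sketch of the approach discussed above: select only files whose mtime falls inside the scan window with find(1), then hand them to clamdscan with --fdpass so clamd itself need not run as root (--fdpass passes open file descriptors to the daemon; --stream instead copies file contents over the socket, which also works for a remote clamd). The scan root and the 7-day window are illustrative assumptions, not recommendations from this thread.

```shell
#!/bin/sh
# Sketch: scan only recently modified regular files via a non-root clamd.
# SCAN_ROOT and DAYS are hypothetical defaults -- adjust per system.
SCAN_ROOT="${SCAN_ROOT:-/srv}"
DAYS="${DAYS:-7}"
LIST="$(mktemp)"

# Collect regular files on this filesystem whose mtime is within the window.
find "$SCAN_ROOT" -xdev -type f -mtime "-$DAYS" -print0 > "$LIST"

if command -v clamdscan >/dev/null 2>&1; then
    # --fdpass: pass the open fd to clamd, so the daemon does not need
    # root or the same filesystem view; --infected: report only hits.
    xargs -0 -r clamdscan --fdpass --infected < "$LIST"
else
    # Count NUL separators to report how many files would have been scanned.
    echo "clamdscan not installed; would scan $(tr -cd '\0' < "$LIST" | wc -c) files"
fi
rm -f "$LIST"
```

Pointing several such hosts at different clamd servers (via TCPSocket in clamd.conf) spreads the load, and the mtime filter keeps repeat runs cheap, which matches the days-not-hours cadence above.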