Dennis Peterson wrote:
René Bellora wrote:


This sounded like a good idea, so I gave it a try. After spending a few hours scripting the softlinks, I got it to work for small file lists, but it still doesn't work for large ones (~5000 files). When I run 'clamscan /tmp/clamscan/*' I get the following error:

/usr/bin/clamscan: Argument list too long

This could be circumvented with xargs:
cd /tmp/clamscan
find . -type f -print0 | xargs -0 clamscan

That assumes Linux, or at least GNU find and xargs. Also, the files are soft links, so -type f won't match them; find tests the link itself rather than its target unless it is told to follow links.

Also, I believe the OP was interested in scanning all the files from a single invocation of clamscan, and xargs won't necessarily do that; with a long enough list it will split the files across several clamscan runs.
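
For what it's worth, here is a rough sketch of how that pipeline could be adjusted, assuming GNU find/xargs and the /tmp/clamscan link directory mentioned earlier; -L makes find follow the soft links so -type f matches the real files, and the redirect collects the output of however many clamscan runs xargs ends up spawning into a single log (the log path is just a placeholder):

find -L /tmp/clamscan -type f -print0 | xargs -0 clamscan >> /tmp/clamscan.log 2>&1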

As the requirements have evolved, it seems a Perl solution is the most attractive option, both for creating the list and for logging the results, and it would eliminate the earlier suggestion of using soft links. This looks interesting: http://www.fpsn.net/index.cgi?pg=products&product=File::Scan::ClamAV

That's not a bad suggestion. It would take me some time to get that working since I'm a novice at Perl, but it gives me a project.

It allows sending files as streams to clamd, so there is only a single invocation of Perl; clamd is presumed to be already running.

Finally, it is still possible to hack clamscan to read a file containing a list of the filenames to scan.
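
Until something like that exists, a prepared file list can already be fed to clamscan through xargs without hitting the argument-list limit. A minimal sketch, assuming one filename per line (no embedded newlines) in a hypothetical /tmp/filelist.txt:

tr '\n' '\0' < /tmp/filelist.txt | xargs -0 clamscan

Note that xargs may still break a very long list into several clamscan runs, which is why a built-in file-list option in clamscan would be the cleaner route.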


I logged an enhancement request in Bugzilla, but I don't expect it will get a very high priority.

--
Chris
