On 09.08.2016 at 18:40, G.W. Haywood wrote:
> > Does anybody have any feedback on the proposed solution to scanning
> > large files in chunks?
>
> Stop worrying about it, it's a waste of time and effort. The
> probability that you will actually find what you're looking for is
> very small.
>
> > ... are there any reasons that the method wouldn't work for all file
> > types, assuming that the initial bytes of the file are prepended to
> > each chunk so that ClamAV knows what type of file it is?
>
> Yes. Because of what I wrote above. Forget prepended bytes and fancy
> ways of doing things that won't solve the problem. Look at the
> problem in a different way. I'm sure this isn't what you want to
> hear, but it's the way things are.
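For clarity, here is a minimal sketch of the chunking method being discussed: split a large file into fixed-size chunks and prepend the file's initial bytes to every chunk after the first, so a scanner's file-type detection still sees the original header. All names and sizes here (HEADER_SIZE, CHUNK_SIZE, make_chunks) are illustrative assumptions, not part of any ClamAV API.

```python
HEADER_SIZE = 512         # bytes of the original file kept for type detection (assumed value)
CHUNK_SIZE = 1024 * 1024  # scan window per chunk, 1 MiB here (assumed value)

def make_chunks(data: bytes):
    """Yield scan units: each chunk of the body, with the file header
    prepended to all chunks after the first so the type stays detectable."""
    header = data[:HEADER_SIZE]
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        if offset == 0:
            yield chunk           # first chunk already begins with the header
        else:
            yield header + chunk  # prepend header so type detection still works
```

Note this also illustrates Haywood's objection: a signature that straddles a chunk boundary is never presented to the scanner in one piece, so prepending the header does not make the method reliable.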
On the other hand, content-scanner limits are often around 260 KB, with the justification that "scan time may be much higher, and nobody sends such large spam because of the send rate" - meanwhile it has become common for junk mail to carry a large attachment precisely to bypass the filter.
Back to ClamAV: even if you scan "only" 20 MB files and not gigabytes, why should it be a goal to allocate that memory all at once and cause trouble in the case of *parallel scans*, like on a mailserver?
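A back-of-the-envelope calculation makes the parallel-scan point concrete. The numbers below are my own illustrative assumptions (50 concurrent scans, 1 MiB chunk buffer), not figures from the thread:

```python
FILE_SIZE_MB = 20    # whole-file buffer per scan, the 20 MB case from the thread
CHUNK_MB = 1         # fixed per-scan buffer if scanning in chunks (assumed size)
PARALLEL_SCANS = 50  # concurrent messages on a busy mailserver (assumed load)

# Peak memory if every scan allocates the whole file at once:
whole_file_peak = FILE_SIZE_MB * PARALLEL_SCANS  # 20 * 50 = 1000 MB resident

# Peak memory if every scan only holds one fixed-size chunk:
chunked_peak = CHUNK_MB * PARALLEL_SCANS         # 1 * 50 = 50 MB resident
```

Under these assumptions, whole-file allocation needs roughly a gigabyte of buffer space at peak, while fixed-size buffers stay bounded regardless of attachment size.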
That, together with the already terrible memory usage of the signatures, shows that ClamAV *will have* to deal with these problems in the future, since it's already the main memory consumer on an inbound mailserver.
_______________________________________________
Help us build a comprehensive ClamAV guide:
https://github.com/vrtadmin/clamav-faq
http://www.clamav.net/contact.html#ml