* jcastanet <[EMAIL PROTECTED]> [20050604 10:52] wrote:
> this is the answer from Andreas Mueller, the person who created
> mod_clamav.
> 
> As you can see, he says that this problem is a matter of ClamAV!
> 
> That is why I started a thread on this forum.
> 
> You said that this problem is not a ClamAV issue.
> 
> Andreas Mueller said this problem is not a mod_clamav issue.
> 
> We are playing a game of PING-PONG!
>  
> 
> ################## Andreas Mueller wrote to me #######################
> 
> Am 13.05.2005 um 15:49 schrieb Jean Philippe (EXT):
> 
> *       Is it possible to configure mod_clamav or clamav (0.85) to detect
> viruses that are compressed multiple times?
> 
> This  is  entirely  a  problem of clamav, as mod_clamav hands the file over
> without modification to clamav for checking. Clamav  is supposed to do any
> extractions, maybe you need to modify the level to which recursion is done
> (this is, if I remember  correctly, a configuration parameter of clamav).
> 
> On Fri, 3 Jun 2005 11:17:18 +0200
> "jcastanet" <[EMAIL PROTECTED]> wrote:
> 
> > hi,
> > 
> > I'm using mod_clamav to scan HTTP downloads, and everything works very well.
> > 
> > However... when a virus is zipped two times or more, the virus is not
> > detected in the zipped file.
> > 
> > The virus is detected only if it has been compressed once.
> > 
> > Mod_clamav uses libclamav to detect viruses.
> 
> Report this problem to the mod_clamav folks, not here.
> 
> > When I scan the same file (compressed many times) directly with
> > clamscan or clamdscan, the virus is detected.

Let's take another look at it. You say that if you scan the file many
times with clamscan or clamdscan, then the virus is detected. Do you mean
that you decompress the file, scan the result, decompress again what you
got, scan again, and so on? What exactly do you mean by "many times"?
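Just so we mean the same thing by "zipped two times": here is a minimal
sketch of how such a file is built, using Python's standard zipfile module.
The payload bytes and file names below are made-up placeholders; for a real
detection test you would put the EICAR test string inside instead.

```python
import io
import zipfile

def nest_in_zips(payload: bytes, levels: int) -> bytes:
    """Wrap payload in `levels` nested zip archives, innermost first."""
    data = payload
    name = "payload.bin"
    for i in range(levels):
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
            zf.writestr(name, data)
        data = buf.getvalue()
        name = f"level{i}.zip"  # each layer wraps the previous archive
    return data

def unwrap_once(data: bytes) -> bytes:
    """Extract the single member of a zip archive."""
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        return zf.read(zf.namelist()[0])

# "Zipped two times": a scanner must recurse through 2 archive layers
doubly_zipped = nest_in_zips(b"test-payload", 2)
inner = unwrap_once(doubly_zipped)            # still a zip file
assert zipfile.is_zipfile(io.BytesIO(inner))
original = unwrap_once(inner)                 # now the raw payload
assert original == b"test-payload"
```

A scanner that only unpacks one level would see `inner` (itself a zip) and
never reach `original` - which is exactly the situation being described.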

Can you put that file somewhere on a publicly accessible website and let
us download it?

When Dr. Andreas Muller responded to you, he mentioned that

<quote Muller>

... Clamav is supposed to do any
extractions, maybe you need to modify the level to which recursion is
done (this is, if I remember correctly, a configuration parameter of
clamav)

</quote>

So let's think about it again. In clamd.conf, we have the following
config parameters in the section dealing with the scanning of archives
(compressed files, yes?), and I paste here verbatim:

PS: I am running CVS here, and it has the new config parser, so don't
be surprised with the yes and no you see here!

<begin>

## Archives
##

# ClamAV can scan within archives and compressed files.
# Default: yes
ScanArchive yes

# The options below protect your system against Denial of Service attacks
# using archive bombs.

# Files in archives larger than this limit won't be scanned.
# Value of 0 disables the limit.
# Default: 10M
ArchiveMaxFileSize 10M

# Nested archives are scanned recursively, e.g. if a Zip archive contains a RAR
# file, all files within it will also be scanned. This option specifies how
# deep the recursion should go.
# Value of 0 disables the limit.
# Default: 8
ArchiveMaxRecursion 8

# Number of files to be scanned within an archive.
# Value of 0 disables the limit.
# Default: 1000
ArchiveMaxFiles 1000

# If a file in an archive is compressed more than ArchiveMaxCompressionRatio
# times it will be marked as a virus (Oversized.ArchiveType, e.g. Oversized.Zip)
# Value of 0 disables the limit.
# Default: 250
ArchiveMaxCompressionRatio 250

# Use slower but memory efficient decompression algorithm.
# only affects the bzip2 decompressor.
# Default: no
#ArchiveLimitMemoryUsage yes

# Mark encrypted archives as viruses (Encrypted.Zip, Encrypted.RAR).
# Default: no
#ArchiveBlockEncrypted no

# Mark archives as viruses (e.g. RAR.ExceededFileSize, Zip.ExceededFilesLimit)
# if ArchiveMaxFiles, ArchiveMaxFileSize, or ArchiveMaxRecursion limit is
# reached.
# Default: no
#ArchiveBlockMax no

</end>


So, my question now is: have you given those parameters a good thought
when handling the archive you are having trouble with?

Under normal conditions, why would someone want to compress an archive so
many times, having an archive within an archive within another archive ad
nauseam, when specifying a compression ratio (yes, it can be specified for
some archivers) should already do a good enough job?

In your case, you should try a suitable value for ArchiveMaxRecursion.
Myself, I'd simply use the defaults - ArchiveMaxRecursion and
ArchiveBlockMax - and go for a beer ;-)
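To see why ArchiveMaxRecursion is the parameter to look at, here is a toy
model of a depth-limited archive scanner in Python. This is only an
illustration of the general idea, not ClamAV's actual code; the signature
string and member names are made up.

```python
import io
import zipfile

def nest(payload: bytes, levels: int) -> bytes:
    """Helper: wrap payload in `levels` nested zip archives."""
    data = payload
    for _ in range(levels):
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w") as zf:
            zf.writestr("member", data)
        data = buf.getvalue()
    return data

def scan(data: bytes, signature: bytes, max_recursion: int, depth: int = 0) -> bool:
    """Toy scanner: recurse into zip archives up to max_recursion levels,
    then look for a byte signature in whatever is not an archive."""
    if zipfile.is_zipfile(io.BytesIO(data)):
        if depth >= max_recursion:
            return False  # limit reached: the nested payload is never inspected
        with zipfile.ZipFile(io.BytesIO(data)) as zf:
            return any(scan(zf.read(name), signature, max_recursion, depth + 1)
                       for name in zf.namelist())
    return signature in data

blob = nest(b"...BAD-SIGNATURE...", 3)                    # "zipped three times"
assert scan(blob, b"BAD-SIGNATURE", max_recursion=8)      # deep enough: found
assert not scan(blob, b"BAD-SIGNATURE", max_recursion=2)  # too shallow: missed
```

With a limit of 2, the scanner stops at the third archive layer and never
sees the payload - and with ArchiveBlockMax enabled, hitting that limit
would itself flag the file, which is why the defaults are a reasonable bet.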

Again, that is my humble opinion and you should not imagine crucifying
me for my opinion.

Wash now leaves for the beer dens....


-Wash

http://www.netmeister.org/news/learn2quote.html

--
+======================================================================+
    |\      _,,,---,,_     | Odhiambo Washington    <[EMAIL PROTECTED]>
Zzz /,`.-'`'    -.  ;-;;,_ | Wananchi Online Ltd.   www.wananchi.com
   |,4-  ) )-,_. ,\ (  `'-'| Tel: +254 20 313985-9  +254 20 313922
  '---''(_/--'  `-'\_)     | GSM: +254 722 743223   +254 733 744121
+======================================================================+
Reality is just a convenient measure of complexity.
                -- Alvy Ray Smith