Hi, as mentioned previously, we have a number of servers where the freshclam
run is failing with result code 137. Has anyone seen this, and was a lack of
memory the cause in your case? We're interpreting 137 as 128 + 9, i.e. the
process was killed with SIGKILL (kill -9), and there is no other facility on
these boxes that would issue kill signals, so we suspect the kernel OOM
killer. Any suggestions on confirming the cause?
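
In case it helps anyone suggest a confirmation, here is a minimal sketch of
the check we're planning to run after a failed update (an assumption on our
part: these are Ubuntu boxes, so the kernel log is visible via dmesg and
/var/log/syslog, and the grep patterns match the stock kernel OOM messages):

    #!/bin/sh
    # 137 = 128 + 9: the process was terminated by SIGKILL.
    # If the kernel OOM killer fired, it logs lines like
    # "Out of memory: Kill process <pid> (freshclam)".
    dmesg | grep -iE 'out of memory|oom-kill|killed process'
    grep -iE 'out of memory|oom-kill' /var/log/syslog

If freshclam shows up in those messages, that would confirm the OOM theory.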

Thanks!

On Wed, Jun 28, 2017 at 9:22 AM, David Pullman <david.pull...@gmail.com>
wrote:

> We've updated the cron script to capture the result code, and we're
> finding that where we get failures, it's consistently 137. I would guess
> this is an OOM situation, but does anyone know of other reasons we might
> get a 137 from a freshclam run?
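>
> For reference, a minimal sketch of the wrapper change (the log path is a
> placeholder, not our real location):
>
>     #!/bin/sh
>     # Run freshclam and record its exit status so failures can be
>     # correlated with other logs later.
>     /usr/bin/freshclam --quiet
>     rc=$?
>     echo "$(date -u '+%F %T') freshclam exited with $rc" >> /var/log/freshclam-cron.log
>     exit $rc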
>
> Thanks!
>
> David
>
> On Wed, Jun 21, 2017 at 7:38 AM, David Pullman <david.pull...@gmail.com>
> wrote:
>
>> I've tried to duplicate this a few times since we started seeing it on a
>> test instance, but no luck. We do have many production instances hitting
>> the error each day, though. I've just proposed changing the cron script
>> to capture and log the result code, since it's not one of the documented
>> codes. Just so we have that info.
>>
>> David
>>
>> On Wed, Jun 21, 2017 at 7:35 AM, David Pullman <david.pull...@gmail.com>
>> wrote:
>>
>>> Yes, there were no new temp dirs left after the successful run. I'm
>>> wondering if it's a time-of-day network issue, or perhaps a problem with
>>> one particular mirror? I've seen complaints about a mirror IP that also
>>> shows up in our logs. Don't know.
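>>>
>>> If anyone wants to rule the mirror in or out, a sketch of what we could
>>> run from cron around the failure window (the TXT record is how freshclam
>>> learns the current database versions, and database.clamav.net is the
>>> default mirror pool, so both names are standard rather than ours):
>>>
>>>     #!/bin/sh
>>>     # DNS check: the TXT record lists current engine/database versions.
>>>     host -t txt current.cvd.clamav.net
>>>     # HTTP check: header-only request against the default mirror pool,
>>>     # using the same 30-second budget as freshclam's connect timeout.
>>>     curl -sS -I --max-time 30 http://database.clamav.net/daily.cvd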
>>>
>>> David
>>>
>>> On Tue, Jun 20, 2017 at 6:25 PM, Steven Morgan <smor...@sourcefire.com>
>>> wrote:
>>>
>>>> David,
>>>>
>>>> Thanks, so when you say freshclam "completed successfully" you mean
>>>> there were no temp files left?
>>>>
>>>> Steve
>>>>
>>>> On Tue, Jun 20, 2017 at 11:21 AM, David Pullman <david.pull...@gmail.com>
>>>> wrote:
>>>>
>>>> > Steve,
>>>> >
>>>> > Yes, we run freshclam and then clamscan once each day at 00:03 UTC.
>>>> > There were many days of tmp directories. We ran the freshclam utility
>>>> > by hand yesterday, on the instance the logs are from, at about 22:00
>>>> > UTC, and it completed the download. The subsequent update at 00:03
>>>> > this morning completed successfully as well.
>>>> >
>>>> > The version is the Ubuntu package of clamav and clamav-freshclam:
>>>> > 0.99.2+addedllvm-0ubuntu0.14.04.1.
>>>> >
>>>> > Thanks!
>>>> >
>>>> > David
>>>> >
>>>> > On Tue, Jun 20, 2017 at 11:03 AM, Steven Morgan <smor...@sourcefire.com>
>>>> > wrote:
>>>> >
>>>> > > David,
>>>> > >
>>>> > > So freshclam runs every day at ~00:03:00, and to confirm, the temp
>>>> > > directories/files are left for each of these runs?
>>>> > >
>>>> > > Which version of ClamAV are you using?
>>>> > >
>>>> > > Steve
>>>> > >
>>>> > > On Tue, Jun 20, 2017 at 7:51 AM, David Pullman <david.pull...@gmail.com>
>>>> > > wrote:
>>>> > >
>>>> > > > Hi Steve,
>>>> > > >
>>>> > > > I've gathered some logs from one of the servers that had a bunch
>>>> > > > of the clamav-nnnnnnnnnn.tmp directories over a number of days.
>>>> > > > I've aggregated seven days of them below (we rotate the log
>>>> > > > daily). We run freshclam from cron each day.
>>>> > > >
>>>> > > > Please let me know if there's any suggestion on how I can find a
>>>> > > > definitive reason for this, or how to correct it. We have two
>>>> > > > issues: one is of course that the sigs are not updated, but also,
>>>> > > > on some of the smaller instances, disk space is consumed by the
>>>> > > > tmp files left in /var/lib/clamav.
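>>>> > > >
>>>> > > > For the disk space side, a sketch of the cleanup we're considering
>>>> > > > (the one-day age threshold is our own arbitrary choice; it just
>>>> > > > needs to be longer than any legitimate freshclam run):
>>>> > > >
>>>> > > >     #!/bin/sh
>>>> > > >     # Remove leftover freshclam temp dirs older than a day; a
>>>> > > >     # healthy run never keeps one around that long.
>>>> > > >     find /var/lib/clamav -maxdepth 1 -type d -name 'clamav-*.tmp' \
>>>> > > >         -mtime +0 -exec rm -rf {} +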
>>>> > > >
>>>> > > > Thanks very much for any suggestions or help!
>>>> > > >
>>>> > > > Tue Jun 13 00:03:01 2017 -> --------------------------------------
>>>> > > > Tue Jun 13 00:03:01 2017 -> ClamAV update process started at Tue Jun 13 00:03:01 2017
>>>> > > > Tue Jun 13 00:03:01 2017 -> main.cld is up to date (version: 58, sigs: 4566249, f-level: 60, builder: sigmgr)
>>>> > > > Tue Jun 13 00:03:09 2017 -> Downloading daily-23452.cdiff [100%]
>>>> > > > Tue Jun 13 00:03:10 2017 -> Downloading daily-23453.cdiff [100%]
>>>> > > > Tue Jun 13 00:03:13 2017 -> Downloading daily-23454.cdiff [100%]
>>>> > > > Wed Jun 14 00:03:02 2017 -> --------------------------------------
>>>> > > > Wed Jun 14 00:03:02 2017 -> ClamAV update process started at Wed Jun 14 00:03:02 2017
>>>> > > > Wed Jun 14 00:03:02 2017 -> main.cld is up to date (version: 58, sigs: 4566249, f-level: 60, builder: sigmgr)
>>>> > > > Wed Jun 14 00:03:38 2017 -> nonblock_connect: connect timing out (30 secs)
>>>> > > >
>>>> > > >
_______________________________________________
clamav-users mailing list
clamav-users@lists.clamav.net
http://lists.clamav.net/cgi-bin/mailman/listinfo/clamav-users


Help us build a comprehensive ClamAV guide:
https://github.com/vrtadmin/clamav-faq

http://www.clamav.net/contact.html#ml
