On Wed, 2014-08-27 at 03:01 +0200, Reindl Harald wrote:
> > If it's internal, it's internal. There is a reason you are setting up
> > lastexternal DNSxL rules.
>
> the intention is to handle the internal IP as if it were external
Again: Craft your samples to match real-life (production) enviro
On 27.08.2014 at 02:24, Karsten Bräckelmann wrote:
> On Wed, 2014-08-27 at 01:08 +0200, Reindl Harald wrote:
>> below is the stdout/stderr of the following script, filtered for "dns"
>> so the lists are queried, but the question remains why that
>> doesn't happen from an IP in the same network
>
> Nope, no R
On Wed, 2014-08-27 at 01:08 +0200, Reindl Harald wrote:
> below is the stdout/stderr of the following script, filtered for "dns"
> so the lists are queried, but the question remains why that
> doesn't happen from an IP in the same network
Nope, no RBL queries. See below.
> in the meantime there are a lot of "c
On 26.08.2014 at 22:23, Matthias Leisi wrote:
> On Tue, Aug 26, 2014 at 9:25 PM, Reindl Harald wrote:
>
>>> spamc -your_normal_spamc_options >
>> are we really talking about the same?
>> that won't involve the network
>
> You need a full message, including any Received: etc. headers, as it
> woul
On Tue, Aug 26, 2014 at 9:25 PM, Reindl Harald wrote:
>> spamc -your_normal_spamc_options
> are we really talking about the same?
> that won't involve the network
You need a full message, including any Received: etc. headers, as it
would appear on your MTA when it passes it on to spamc (or
wha
On Tue, 2014-08-26 at 21:25 +0200, Reindl Harald wrote:
> On 26.08.2014 at 21:08, Martin Gregorie wrote:
> > Under the same directory as spamass-milter run:
> >
> > spamc -your_normal_spamc_options
> are we really talking about the same?
> that won't involve the network
>
Of course it will.
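Picking up Martin's suggestion, a minimal sketch (the saved-message filename msg.eml is an assumption): feed a complete raw message, with all its Received: headers, to spamc the same way the milter would, and the network tests run in spamd.

```shell
# Pipe a saved raw message (full headers included) through spamc;
# -R prints the report listing the rules that hit.
# msg.eml is a hypothetical filename for illustration.
spamc -R < msg.eml
```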
On Tue, 2014-08-26 at 11:22 -0400, Kris Deugau wrote:
> Is there a way to prevent a URI from being looked up in DNSBLs, without
> *also* preventing that URI from matching on uri regex rules?
>
> I would like to add quite a few popular URL shorteners to
> uridnsbl_skip_domain, but then I can't matc
On 26.08.2014 at 21:08, Martin Gregorie wrote:
> On Tue, 2014-08-26 at 20:08 +0200, Reindl Harald wrote:
>> On 26.08.2014 at 18:11, Axb wrote:
>>> On 08/26/2014 05:42 PM, Reindl Harald wrote:
they are *not*; I specifically added the following lines
to prevent the automatic adding to "trus
On 26/08/2014 21:03, Reindl Harald wrote:
I just don't know how to do that with the setup and mailflow
by just starting "spamassassin -D dns", which runs the process,
but how do I get the mail there?
You need a copy of the message as a text file on your SA machine, then
you simply run, from the co
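As a sketch of that test run (the filename msg.eml is an assumption): with a copy of the message saved as a text file, SpamAssassin can be run in test mode with DNS debugging enabled and its output filtered for the lookups.

```shell
# Run a saved message through SpamAssassin in test mode (-t) with
# DNS debug output (-D dns), keeping only the dns-related lines.
# msg.eml is an assumed filename.
spamassassin -t -D dns < msg.eml 2>&1 | grep -i dns
```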
On Tue, 2014-08-26 at 20:08 +0200, Reindl Harald wrote:
>
> On 26.08.2014 at 18:11, Axb wrote:
> > On 08/26/2014 05:42 PM, Reindl Harald wrote:
> >> they are *not*; I specifically added the following lines
> >> to prevent the automatic adding to "trusted_networks"
> >> since the IP range is outside
On 26.08.2014 at 20:29, Axb wrote:
> On 08/26/2014 08:08 PM, Reindl Harald wrote:
>>
>> On 26.08.2014 at 18:11, Axb wrote:
>>> On 08/26/2014 05:42 PM, Reindl Harald wrote:
they are *not*; I specifically added the following lines
to prevent the automatic adding to "trusted_networks"
On 08/26/2014 08:08 PM, Reindl Harald wrote:
On 26.08.2014 at 18:11, Axb wrote:
On 08/26/2014 05:42 PM, Reindl Harald wrote:
they are *not*; I specifically added the following lines
to prevent the automatic adding to "trusted_networks"
since the IP range is outside
clear_trusted_networks
trust
On 26.08.2014 at 18:11, Axb wrote:
> On 08/26/2014 05:42 PM, Reindl Harald wrote:
>> they are *not*; I specifically added the following lines
>> to prevent the automatic adding to "trusted_networks"
>> since the IP range is outside
>>
>> clear_trusted_networks
>> trusted_networks 192.168.168.0/24
>
On 08/26/2014 05:42 PM, Reindl Harald wrote:
they are *not*; I specifically added the following lines
to prevent the automatic adding to "trusted_networks"
since the IP range is outside
clear_trusted_networks
trusted_networks 192.168.168.0/24
there was no trust at all in the headers and no
hint wh
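The configuration quoted above can be read as follows; this is a hedged sketch of the thread's situation, not a recommendation:

```
# Sketch, using the 192.168.168.0/24 range quoted in the thread.
# Listing a range in trusted_networks makes SpamAssassin treat
# relays in it as trusted, so -lastexternal DNSxL rules skip them:
clear_trusted_networks
trusted_networks 192.168.168.0/24
# To have such an IP handled as if it were external instead, the
# range must NOT appear in trusted_networks at all.
```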
On 26.08.2014 at 17:30, Axb wrote:
> On 08/26/2014 05:25 PM, Reindl Harald wrote:
>> On 26.08.2014 at 17:18, Axb wrote:
>>> On 08/26/2014 04:28 PM, Reindl Harald wrote:
header RCVD_IN_RP_TLDNS1 eval:check_rbl('tldns1-lastexternal',
'dnswl.thelounge.net.')
describe RCVD_IN_RP_TL
On 26.08.2014 at 17:18, Axb wrote:
> On 08/26/2014 04:28 PM, Reindl Harald wrote:
>> header RCVD_IN_RP_TLDNS1 eval:check_rbl('tldns1-lastexternal',
>> 'dnswl.thelounge.net.')
>> describe RCVD_IN_RP_TLDNS1 Custom DNSBL/DNSWL
>> tflags RCVD_IN_RP_TLDNS1 net
>> score RCVD_IN_RP_TLDNS1 -5
>>
On 08/26/2014 05:25 PM, Reindl Harald wrote:
On 26.08.2014 at 17:18, Axb wrote:
On 08/26/2014 04:28 PM, Reindl Harald wrote:
header RCVD_IN_RP_TLDNS1 eval:check_rbl('tldns1-lastexternal',
'dnswl.thelounge.net.')
describe RCVD_IN_RP_TLDNS1 Custom DNSBL/DNSWL
tflags RCVD_IN_RP_TLDNS1 net
s
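For reference, the rule fragment quoted above, completed into a self-contained sketch. The ifplugin guard matches the original poster's snippet later in the thread; the "nice" tflag is an assumption, following the usual SpamAssassin convention for negative-scoring DNS rules:

```
ifplugin Mail::SpamAssassin::Plugin::DNSEval
  header   RCVD_IN_RP_TLDNS1 eval:check_rbl('tldns1-lastexternal', 'dnswl.thelounge.net.')
  describe RCVD_IN_RP_TLDNS1 Custom DNSBL/DNSWL
  tflags   RCVD_IN_RP_TLDNS1 net nice
  score    RCVD_IN_RP_TLDNS1 -5
endif
```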
Is there a way to prevent a URI from being looked up in DNSBLs, without
*also* preventing that URI from matching on uri regex rules?
I would like to add quite a few popular URL shorteners to
uridnsbl_skip_domain, but then I can't match those domains in uri regex
rules for feeding "x and URL shorte
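One hedged sketch of the combination Kris describes, with bit.ly as an assumed example shortener. In stock SpamAssassin, uridnsbl_skip_domain is documented to suppress only the URIDNSBL lookups; whether it also interferes with uri regex rules in his setup is the open question:

```
# Skip DNSBL lookups for the shortener domain itself...
uridnsbl_skip_domain bit.ly
# ...while still matching it with an ordinary uri regex rule
# (rule name and score are illustrative):
uri      LOCAL_URI_BITLY  m{^https?://bit\.ly/}i
describe LOCAL_URI_BITLY  Message contains a bit.ly shortened URL
score    LOCAL_URI_BITLY  0.1
```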
On 08/26/2014 04:28 PM, Reindl Harald wrote:
On 26.08.2014 at 15:54, Axb wrote:
On 08/26/2014 03:00 PM, Reindl Harald wrote:
On 26.08.2014 at 14:25, Joe Quinn wrote:
On 8/26/2014 8:04 AM, Reindl Harald wrote:
sadly the Wiki doesn't refer to check_rbl()
https://wiki.apache.org/spamassassin/Wr
>
> From: Ian Zimmerman
> Sent: Monday, August 25, 2014 5:02 PM
> To: users@spamassassin.apache.org
> Subject: Re: drop of score after update tonight
> On Mon, 25 Aug 2014 19:50:20 +,
> David Jones wrote:
> Ian> I definitely have FNs today (about 10
On 26.08.2014 at 15:54, Axb wrote:
> On 08/26/2014 03:00 PM, Reindl Harald wrote:
>> On 26.08.2014 at 14:25, Joe Quinn wrote:
>>> On 8/26/2014 8:04 AM, Reindl Harald wrote:
sadly the Wiki doesn't refer to check_rbl()
https://wiki.apache.org/spamassassin/WritingRules
>>> You can use
On 08/26/2014 03:00 PM, Reindl Harald wrote:
On 26.08.2014 at 14:25, Joe Quinn wrote:
On 8/26/2014 8:04 AM, Reindl Harald wrote:
I am trying to write my own RBL rules for blacklisting and
especially whitelisting using internal DNSBL/DNSWL but
my first try results in warnings at startup
sadly t
On 26.08.2014 at 14:25, Joe Quinn wrote:
> On 8/26/2014 8:04 AM, Reindl Harald wrote:
>> I am trying to write my own RBL rules for blacklisting and
>> especially whitelisting using internal DNSBL/DNSWL but
>> my first try results in warnings at startup
>>
>> sadly the Wiki doesn't refer to check_rbl(
On 8/26/2014 8:04 AM, Reindl Harald wrote:
Hi
I am trying to write my own RBL rules for blacklisting and
especially whitelisting using internal DNSBL/DNSWL but
my first try results in warnings at startup
sadly the Wiki doesn't refer to check_rbl()
https://wiki.apache.org/spamassassin/WritingRules
i
Hi
I am trying to write my own RBL rules for blacklisting and
especially whitelisting using internal DNSBL/DNSWL but
my first try results in warnings at startup
sadly the Wiki doesn't refer to check_rbl()
https://wiki.apache.org/spamassassin/WritingRules
ifplugin Mail::SpamAssassin::Plugin::DNSEval
On 26.08.2014 at 11:30, Axb wrote:
> On 08/26/2014 11:23 AM, Reindl Harald wrote:
>> I am building the new MTA which will replace a commercial
>> spamfilter appliance and currently I am training Bayes and
>> building admin backends
>>
>> * postscreen with RBL/DNSWL weight
>> * PTR filters
>
On 08/26/2014 11:23 AM, Reindl Harald wrote:
I am building the new MTA which will replace a commercial
spamfilter appliance and currently I am training Bayes and
building admin backends
* postscreen with RBL/DNSWL weight
* PTR filters
* subject filters
* attachment extensions
* ClamAV milte
On 26.08.2014 at 10:52, Matthias Leisi wrote:
> On Tue, Aug 26, 2014 at 10:16 AM, Reindl Harald
> wrote:
>
> ADVANCE_FEE_4_NEW,ADVANCE_FEE_4_NEW_MONEY,ADVANCE_FEE_5_NEW,ADVANCE_FEE_5_NEW_MONEY,ALL_TRUSTED,BAYES_99,BAYES_999,DEAR_SOMETHING,DKIM_ADSP_CUSTOM_MED,FREEMAIL_FROM,LOTS_OF_MONEY,T_MON
On Tue, Aug 26, 2014 at 10:16 AM, Reindl Harald wrote:
ADVANCE_FEE_4_NEW,ADVANCE_FEE_4_NEW_MONEY,ADVANCE_FEE_5_NEW,ADVANCE_FEE_5_NEW_MONEY,ALL_TRUSTED,BAYES_99,BAYES_999,DEAR_SOMETHING,DKIM_ADSP_CUSTOM_MED,FREEMAIL_FROM,LOTS_OF_MONEY,T_MONEY_PERCENT,URG_BIZ
>>> scantime=0.3,size=4760,user=sa-milt
found it - look at the bottom
the other thread, where I try to find out why spam messages don't
get [SPAM] in the subject (still unsolved), turned out that
"sa-update" had obviously changed the permissions of the folder
"updates_spamassassin_org" to 750 instead of 755
after fixing that it is again above 7
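The permissions fix can be sketched as follows; the scratch directory stands in for the real sa-update state directory (commonly under /var/lib/spamassassin, but the exact path is an assumption and varies by install):

```shell
# Reproduce the broken state in a scratch dir, then apply the fix.
dir="$(mktemp -d)/updates_spamassassin_org"
mkdir -p "$dir"
chmod 750 "$dir"     # the mode sa-update reportedly left behind
chmod 755 "$dir"     # the fix: traversable again for the spamd user
stat -c '%a' "$dir"  # prints 755 on GNU stat
```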
On 26.08.2014 09:30, Ian Zimmerman wrote:
Apparently not. So, I have to rephrase: Isn't it a bit odd to use
these external rules? :)
No, I don't think that it's odd to use other statistical filters than the
SA Bayes.
CRM114 uses a completely different algorithm, building statistics not
j
On 26.08.2014 at 08:54, Matthias Leisi wrote:
> On Tue, Aug 26, 2014 at 12:08 AM, Reindl Harald
> wrote:
>
>> Aug 26 00:01:32 mail-gw spamd[6836]: spamd: result: Y 5 -
>> ADVANCE_FEE_4_NEW,ADVANCE_FEE_4_NEW_MONEY,ADVANCE_FEE_5_NEW,ADVANCE_FEE_5_NEW_MONEY,ALL_TRUSTED,BAYES_99,BAYES_999,DEAR_SOM
On Tue, 26 Aug 2014 08:10:23 +0200,
Matus UHLAR - fantomas wrote:
Ian> Isn't it a bit odd that SA has rules for all these other Bayes
Ian> powered backends? Why not give a bit more weight to its own Bayes
Ian> instead, rather than make users forage for other tools that do
Ian> essentially the sa