Re: Custom Rule to catch this

2007-03-08 Thread Loren Wilton
BTW I'm not sure it's necessary to escape the space character within the [square brackets] - I think it's acceptable to just have [ ] without the \ inside. Although it doesn't do any harm having it in there either... No need to escape the space. Even better to include both space and tab chara
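[Editor's note: Loren's point about the unescaped space can be checked directly. A minimal sketch in Python's re module; the patterns below are illustrative, not taken from the thread.]

```python
import re

sample = "www .com"

# A space inside a character class needs no escaping:
# [ ] and [\ ] compile to the same thing.
print(bool(re.search(r"www[ ]+\.com", sample)))   # True
print(bool(re.search(r"www[\ ]+\.com", sample)))  # True

# Including a tab in the class, as suggested, catches either
# whitespace character.
print(bool(re.search(r"www[ \t]+\.com", "www\t.com")))  # True
```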

Re: Custom Rule to catch this

2007-03-08 Thread Jeremy Fairbrass
Strange indeed - not for me - I'm using The Regex Coach from http://weitz.de/regex-coach/ which so far always does a perfect job of testing regex. Maybe it's wrong on this case - who knows! :) BTW I'm not sure it's necessary to escape the space character within the [square brackets] - I think i

Re: Custom Rule to catch this

2007-03-08 Thread kshatriyak
On Thu, 8 Mar 2007, Jeremy Fairbrass wrote: I just tested those three rules below, and none of them work with "www.superveils . com" (ie. having a space both before and after that dot). Strange, it matches rule 3 with egrep: echo 'www.superveils . com' | egrep 'www[\ ]+?\.([a-z0-9\-\ ]?)+\.[
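[Editor's note: one plausible explanation for the egrep vs. Regex Coach disagreement is the "+?" in the pattern. POSIX egrep has no lazy quantifiers, and GNU grep parses "+?" as an optional one-or-more (effectively "*"), while Perl-compatible engines parse it as a lazy "+" that still requires at least one space after "www". A sketch of both readings, using a simplified prefix of the rule since the full pattern is truncated in the archive:]

```python
import re

sample = "www.superveils . com"

# Perl-compatible engines (SpamAssassin, Regex Coach) read "+?" as a
# lazy one-or-more: at least one space is still required after "www",
# so the rule cannot match this sample.
print(bool(re.search(r"www[ ]+?\.", sample)))  # False

# GNU egrep reportedly treats "+?" as an optional "+", i.e. effectively
# "*": zero spaces are allowed, which would explain the egrep match.
print(bool(re.search(r"www[ ]*\.", sample)))   # True
```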

Re: Custom Rule to catch this

2007-03-08 Thread Jeremy Fairbrass
I just tested those three rules below, and none of them work with "www.superveils . com" (ie. having a space both before and after that dot). You might want to try my version of this rule instead - it's attached to avoid line wraps. Works well for double-spaces in a URL (including on either si

Re: Custom Rule to catch this

2007-03-08 Thread Nigel Frankcom
On Thu, 8 Mar 2007 10:23:05 +0100, [EMAIL PROTECTED] wrote: >On Thu, 8 Mar 2007, [EMAIL PROTECTED] wrote: > >> I searched the list and found this rule to catch URL with single space >> (www.ledrx .com). Please help me in modifying this rule to catch URL >> with double space (www.superveils . com

Re: Custom Rule to catch this

2007-03-08 Thread kshatriyak
On Thu, 8 Mar 2007, [EMAIL PROTECTED] wrote: I searched the list and found this rule to catch URL with single space (www.ledrx .com). Please help me in modifying this rule to catch URL with double space (www.superveils . com). body URL_WITH_SPACE m/\bhttp:\/\/[a-z0-9\-.]+[!*%&, -]+\.?com\b/
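[Editor's note: the posted rule can be exercised outside SpamAssassin to confirm the reported gap. "http://" is prepended to the thread's samples here, since the rule anchors on it; the double_space variant below is a hypothetical extension for illustration, not a pattern from the thread.]

```python
import re

# The rule as posted: the [!*%&, -]+ class absorbs a single space
# before the final dot, but nothing between the dot and "com".
url_with_space = re.compile(r"\bhttp://[a-z0-9\-.]+[!*%&, -]+\.?com\b")

print(bool(url_with_space.search("http://www.ledrx .com")))        # True
print(bool(url_with_space.search("http://www.superveils . com")))  # False

# Hypothetical extension: also allow an optional space after the dot,
# covering the "space on both sides" case the poster asked about.
double_space = re.compile(r"\bhttp://[a-z0-9\-.]+[!*%&, -]+\.? ?com\b")
print(bool(double_space.search("http://www.superveils . com")))    # True
```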

Re: Custom Rule to catch this

2007-03-08 Thread Nigel Frankcom
> >body URL_WITH_SPACE m/\bhttp:\/\/[a-z0-9\-.]+[!*%&, -]+\.?com\b/ > >Thanks. > >-----Original message----- >From: David Goldsmith [EMAIL PROTECTED] >Date: Wed, 7 Mar 2007 11:57:21 -0500 >To: users@spamassassin.apache.org >Subject: Re: Custom Rule to catch t

Re: Custom Rule to catch this

2007-03-08 Thread spamassassin
From: David Goldsmith [EMAIL PROTECTED] Date: Wed, 7 Mar 2007 11:57:21 -0500 To: users@spamassassin.apache.org Subject: Re: Custom Rule to catch this > -----BEGIN PGP SIGNED MESSAGE----- > Hash: SHA1 > > [EMAIL PROTECTED] wrote: > > Has anyone written a custom rule to catch t

Re: Custom Rule to catch this

2007-03-07 Thread David Goldsmith
-----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 [EMAIL PROTECTED] wrote: > Has anyone written a custom rule to catch this spam? > > It would be of great help. > > Thanks. > > -----Original Message----- > From: Carmella Boehm [mailto:[EMAIL PROTECTED] > Sent: Wed

Custom Rule to catch this

2007-03-07 Thread spamassassin
Has anyone written a custom rule to catch this spam? It would be of great help. Thanks. -----Original Message----- From: Carmella Boehm [mailto:[EMAIL PROTECTED] Sent: Wednesday, March 07, 2007 6:51 PM To: [EMAIL PROTECTED] Subject: Account Info Friendly Reminder; Get your desired