Maybe you should use a tool that has been created for the sole purpose of
filtering web sites, like e2guardian, squidGuard, etc.
Fred
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users
On 01/12/2014 09:57, Amos Jeffries wrote:
[...]
If you are importing a public list of domains to block please
investigate whether your list source supports squid dstdomain ACL
formats. The best lists provide files with Squid dstdomain format
(which is also almost identical to the rbldnsd '[...]
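As an illustration of the dstdomain file format mentioned above (the file name and entries here are made up for this sketch): each line is a destination domain, and a leading dot means "this domain and all its subdomains":

```
# illustrative blocklist file, e.g. /etc/squid/blocked_domains.txt
.example-ads.com
.tracker.example.net
badhost.example.org
```

Such a file is loaded from squid.conf with an acl line like `acl blocked dstdomain "/etc/squid/blocked_domains.txt"`.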
On 28/11/2014 10:23 p.m., navari.lore...@gmail.com wrote:
> I saw that the error does not prevent using more than 100 lines. I
> have no problem with CPU usage (7%). I just do not like seeing the
> "WARNING".
>
The RE engine can scan for indiv[...]
This is the error: WARNING: there are more than 100 regular expressions.
Consider using less REs or use rules without expressions like 'dstdomain'.
--
View this message in context:
http://squid-web-proxy-cache.1019090.n4.nabble.com/WARNING-there-are-more-than-100-regular-expressions-tp46685
On Thu 2014-11-27 at 01:59 -0800, navari.lore...@gmail.com wrote:
> "Consider using less REs ..." is not possible.
>
> If there is no other solution
> I will break the files into many files with fewer than 100 entries each.
>
> Probably I will have the same problem with the blacklist.
How many REs do you n[...]
I saw that the error does not prevent using more than 100 lines. I have no
problem with CPU usage (7%). I just do not like seeing the "WARNING".
Blocking Facebook and Twitter can be done with ACLs based on dstdomain;
they are much faster than REs.
Marcus
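Marcus's suggestion can be sketched as a squid.conf fragment (the ACL name is illustrative; the leading dot also matches subdomains):

```
acl social dstdomain .facebook.com .twitter.com
http_access deny social
```

dstdomain entries are matched with an indexed lookup rather than by running every RE against each request, which is why they scale much better than a long regex list.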
On 11/27/2014 10:01 AM, navari.lore...@gmail.com wrote:
ok
I don't intend to use REs for blacklisting but only for blocking some sites
like Facebook, Twitter...
In the other file I have [...]
On 27/11/14 07:59, navari.lore...@gmail.com wrote:
"Consider using less REs ..." is not possible.
So don't worry about this WARNING message. It is just a warning,
not an error. If you are aware that using lots of REs can hit CPU
usage hard, just go for it.
--
Regards,
ok
I don't intend to use REs for blacklisting but only for blocking some sites
like Facebook, Twitter...
In the other file I have about 120-150 REs.
How many REs do you have?
And do you intend to use REs for blacklisting?
Marcus
On 11/27/2014 08:33 AM, Helmut Hullen wrote:
Hello, navari.lore...@gmail.com,
you wrote on 27.11.14:
"Consider using less REs ..." is not possible.
Then try something like "squidguard" with lots of user defined domains and URLs.
Hello, navari.lore...@gmail.com,
you wrote on 27.11.14:
> "Consider using less REs ..." is not possible.
Then try something like "squidguard" with lots of user defined domains
and URLs.
Best regards!
Helmut
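A minimal squidGuard configuration in the spirit of Helmut's suggestion (all paths and names here are illustrative, not taken from the thread):

```
# squidGuard.conf -- sketch only
dbhome /var/lib/squidguard/db
logdir /var/log/squidguard

dest blocked {
    domainlist blocked/domains   # one domain per line
    urllist    blocked/urls      # one URL per line
}

acl {
    default {
        pass !blocked all
        redirect http://localhost/blocked.html
    }
}
```

Squid hands each request to squidGuard through its url_rewrite_program directive, so the domain/URL lists never have to be loaded as Squid REs at all.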
"Consider using less REs ..." is not possible.
If there is no other solution
I will break the files into many files with fewer than 100 entries each.
Probably I will have the same problem with the blacklist.
Thanks
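The splitting step can be sketched with split(1); the stand-in file below is generated under /tmp for illustration (the real list in the thread is /etc/squid/direct_urls.txt):

```shell
# generate a stand-in list of 150 entries
seq 1 150 | sed 's/^/pattern-/' > /tmp/direct_urls.txt

# split into chunks of at most 99 lines each:
# /tmp/direct_urls.part.aa (99 lines), /tmp/direct_urls.part.ab (51 lines)
split -l 99 /tmp/direct_urls.txt /tmp/direct_urls.part.

wc -l /tmp/direct_urls.part.*
```

Each chunk would then need its own ACL reference in squid.conf; note this only splits the files, it does not reduce the total regex-matching work per request.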
Hello, navari.lore...@gmail.com,
you wrote on 27.11.14:
> i have these Warnings
> squid -k parse
> ..
> 2014/11/27 09:36:22| Processing: acl direct_urls dstdom_regex
> "/etc/squid/direct_urls.txt"
> 2014/11/27 09:36:22| /etc/squid/squid.conf line 86: acl direct_urls
> dstdom_regex
Good day,
i have these Warnings
squid -k parse
..
2014/11/27 09:36:22| Processing: acl direct_urls dstdom_regex "/etc/squid/direct_urls.txt"
2014/11/27 09:36:22| /etc/squid/squid.conf line 86: acl direct_urls dstdom_regex "/etc/squid/direct_urls.txt"
2014/11/27 09:36:22|