From: Amos Jeffries
> any special meaning (like doing a lookahead) is prevented.
OK, so I'll do an acl for deny and another for allow.
Thanks
___
squid-users mailing list
squid-users@lists.squid-cache.org
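Squid's urlpath_regex patterns are POSIX extended regular expressions, which have no Perl-style lookahead, hence the two-ACL approach. A minimal sketch of what that could look like (ACL names are illustrative, not from the thread):

```
# Sketch only: allow the exception first, then deny the general case.
# With http_access, the first matching line wins.
acl mriweb_dll urlpath_regex -i mriweb\.dll$
acl any_dll    urlpath_regex -i \.dll$
http_access allow mriweb_dll
http_access deny  any_dll
```

Note that the allow line also permits those URLs past any later deny rules, so its placement relative to the rest of the http_access list matters.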
On 18/09/17 21:04, Antony Stone wrote:
On Monday 18 September 2017 at 09:43:12, Vieri wrote:
Hi,
I'd like to block access to URLs ending in *.dll except for those ending in
mriweb.dll.
acl denied_filetypes urlpath_regex -i denied.filetypes
where denied.filetypes contains a list of expressions
On Monday 18 September 2017 at 09:43:12, Vieri wrote:
> Hi,
>
> I'd like to block access to URLs ending in *.dll except for those ending in
> mriweb.dll.
>
> acl denied_filetypes urlpath_regex -i denied.filetypes
>
> where denied.filetypes contains a list of expressions
Are the others working?
Hi,
I'd like to block access to URLs ending in *.dll except for those ending in
mriweb.dll.
acl denied_filetypes urlpath_regex -i denied.filetypes
where denied.filetypes contains a list of expressions of which:
(\?!mriweb\.dll$).*\.dll$
This doesn't seem to work if I try to deny access.
On 19/01/2016 6:56 a.m., Lucía Guevgeozian wrote:
> Thank you very much for your responses.
>
> I understand from http://www.squid-cache.org/Doc/config/http_access/ that
> http_access will not work with https in versions of Squid older than 3.3.
Incorrect. http_access works with any HTTP message.
On Monday 18 January 2016 at 19:43:56, Jorgeley Junior wrote:
> I didn't test this, but I think it works better:
> *http_access deny banned_sites !good_facebook*
> Does it work?
That would work, yes, but:
- it's not as obvious as putting two lines one after the other
- this is only an example
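For reference, the two forms being compared (ACL names taken from the thread; all ACLs on one http_access line are ANDed, and ! negates):

```
# (a) two lines, first matching line wins:
http_access allow good_facebook
http_access deny banned_sites

# (b) one combined line: deny only when banned_sites matches
#     AND good_facebook does not:
http_access deny banned_sites !good_facebook
```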
Hi, unfortunately I tried that already, and in version 3.0 I can say it
didn't work.
cheers
2016-01-18 15:43 GMT-03:00 Jorgeley Junior :
> I didn't test this, but I think it works better:
> *http_access deny banned_sites !good_facebook*
> Does it work?
>
> 2016-01-18 16:35 GMT-02:00 Lucía Guevgeozian :
I didn't test this, but I think it works better:
*http_access deny banned_sites !good_facebook*
Does it work?
2016-01-18 16:35 GMT-02:00 Lucía Guevgeozian :
> Ok, thanks again for the quick reply, I'm upgrading :)
>
> Regards,
> Lucia
>
> 2016-01-18 14:58 GMT-03:00 Yuri Voinov :
Ok, thanks again for the quick reply, I'm upgrading :)
Regards,
Lucia
2016-01-18 14:58 GMT-03:00 Yuri Voinov :
> 18.01.16 23:56, Lucía Guevgeozian пишет:
> > Thank you very much for your responses.
> >
> > I understand from http://www.squid-cache.org/Doc/config/http_access/ that
18.01.16 23:56, Lucía Guevgeozian wrote:
> Thank you very much for your responses.
>
> I understand from http://www.squid-cache.org/Doc/config/http_access/ that
> http_access will not work with https in versions of Squid older than 3.3.
>
> Do you know if an alternative config exists without upgrading?
Thank you very much for your responses.
I understand from http://www.squid-cache.org/Doc/config/http_access/ that
http_access will not work with https in versions of Squid older than 3.3.
Do you know if an alternative config exists without upgrading?
Regards,
Lucia
2016-01-18 14:38 GMT-03:00 Antony Stone :
18.01.16 23:38, Antony Stone wrote:
> On Monday 18 January 2016 at 18:31:40, Yuri Voinov wrote:
>
>> Facebook (like many others) uses Akamai CDN as a background delivery service.
>>
>> So, the facebook.* domain is only a small part of the whole big fat Facebook :)
On Monday 18 January 2016 at 18:31:40, Yuri Voinov wrote:
> Facebook (like many others) uses Akamai CDN as a background delivery service.
>
> So, the facebook.* domain is only a small part of the whole big fat Facebook :)
True, but that should still match *request* URLs (once the HTTP/S problem is
sorted out).
And more:
Facebook (like many others) uses Akamai CDN as a background delivery service.
So, the facebook.* domain is only a small part of the whole big fat Facebook :)
18.01.16 23:29, Antony Stone wrote:
> On Monday 18 January 2016 at 18:22:24, Lucía Guevgeozian wrote:
On Monday 18 January 2016 at 18:22:24, Lucía Guevgeozian wrote:
> acl good_facebook urlpath_regex groups
> acl banned_sites url_regex "/etc/squid/config/banned_sites"
>
> inside banned_sites I have the word facebook
>
> http_access allow good_facebook
> http_access deny banned_sites
Hello,
I think I have a very basic question about acl, but I can't figure out why
this simple config is not working:
In my squid.conf file I have 2 acl
acl good_facebook urlpath_regex groups
acl banned_sites url_regex "/etc/squid/config/banned_sites"
inside banned_sites I have the word facebook
http_access allow good_facebook
http_access deny banned_sites
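One reason such path-based rules silently fail for Facebook in older Squid: for HTTPS the browser only sends a CONNECT request, so there is no URL path for urlpath_regex to inspect. An illustrative sketch (ACL values from the thread; the comments are my reading of the CONNECT limitation):

```
# For HTTPS without SSL interception, Squid sees only something like
#   CONNECT www.facebook.com:443 HTTP/1.1
# so a path-based ACL can never match an HTTPS request:
acl good_facebook urlpath_regex groups      # no path in a CONNECT request
# a host-based ACL still can:
acl facebook_host dstdomain .facebook.com
```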
On Thursday 12 March 2015 at 12:46:36 (EU time), James Harper wrote:
> > Ah. That is a bug then. The -i bit is not supposed to be treated as a
> > pattern.
>
> Even when I put it in []'s? I think the mistake was mine.
There was no [] in your original posting of your conf file...
> >
> > Found it. Really stupid mistake. The documentation shows [-i] for
> > case insensitivity, but I hadn't picked up that the [] around the -i
> > indicated that it was optional. I had just cut and pasted from
> > examples. So the .cab thing was irrelevant - it just happened that
> > the .cab f
On 13/03/2015 12:30 a.m., James Harper wrote:
>>
>> I also tried the same thing with http_access and that works as expected -
>> *.psf files are allowed, non-*.psf files are denied. I'm thinking bug at this
>> point... I'll do some more testing and see if I can narrow it down.
>>
>
> Found it. Really stupid mistake.
>
> I also tried the same thing with http_access and that works as expected -
> *.psf files are allowed, non-*.psf files are denied. I'm thinking bug at this
> point... I'll do some more testing and see if I can narrow it down.
>
Found it. Really stupid mistake. The documentation shows [-i] for case
insensitivity, but I hadn't picked up that the [] around the -i indicated
that it was optional.
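The pitfall here: squid.conf treats every whitespace-separated token after the ACL type as a separate regex value, so a literally-copied [-i] becomes its own pattern, a character class matching any path containing "-" or "i", which matches almost everything. A sketch of wrong vs. right (ACL name and pattern taken from the thread):

```
# Wrong: "[-i]" is parsed as a first regex value, a character class
# matching any path containing "-" or "i" (so nearly every path):
# acl wuau_path urlpath_regex [-i] \.psf$
# Right: -i is a flag making the following patterns case-insensitive:
acl wuau_path urlpath_regex -i \.psf$
```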
> Three things:
>
> * by re-writing you are generating an entirely new request with the
> apt-cacher server URL as the destination. The HTTP message details about
> what was originally requested and from where are *gone* when the traffic
> leaves for the server. The solution for that is outlined at
On 12/03/2015 9:14 p.m., James Harper wrote:
> I have just noticed that urlpath_regex isn't doing what I want:
>
> acl wuau_repo dstdomain .download.windowsupdate.com
> acl wuau_path urlpath_regex -i \.psf$
> acl dst_server dstdomain server
> acl apt_cacher browser apt-cacher
>
> cache deny dst_server
I have just noticed that urlpath_regex isn't doing what I want:
acl wuau_repo dstdomain .download.windowsupdate.com
acl wuau_path urlpath_regex -i \.psf$
acl dst_server dstdomain server
acl apt_cacher browser apt-cacher
cache deny dst_server
cache deny apt_cacher
cache deny wuau_repo
cache allow
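The preview cuts off at the final rule. Since "cache" rules are also first-match-wins, one plausible completion (hypothetical, not from the thread) would allow the .psf objects before denying the rest of the repo:

```
# Hypothetical sketch: allow .psf objects from the update repo first,
# then deny; "cache allow all" keeps everything else cacheable.
cache allow wuau_repo wuau_path
cache deny wuau_repo
cache deny dst_server
cache deny apt_cacher
cache allow all
```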