________________________________
From: Amos Jeffries <squ...@treenet.co.nz>

>> The following works:
>>
>> acl denied_useragent browser Chrome
>> acl denied_useragent browser MSIE
>> acl denied_useragent browser Opera
>> acl denied_useragent browser Trident
>> [...]
>> http_access deny denied_useragent
>> http_reply_access deny denied_useragent
>> deny_info
>> http://proxy-server1/proxy-error/?a=%a&B=%B&e=%e&E=%E&H=%H&i=%i&M=%M&o=%o&R=%R&T=%T&U=%U&u=%u&w=%w&x=%x&acl=denied_useragent
>> denied_useragent
>>
>> The following works for HTTP sites, but not for HTTPS sites in an
>> ssl-bumped setup:
>>
>> acl allowed_useragent browser Firefox/
>> [...]
>> http_access deny !allowed_useragent
>> deny_info
>> http://proxy-server1/proxy-error/?a=%a&B=%B&e=%e&E=%E&H=%H&i=%i&M=%M&o=%o&R=%R&T=%T&U=%U&u=%u&w=%w&x=%x&acl=allowed_useragent
>> allowed_useragent
>>
> The User-Agent, along with all other HTTP-layer details, is hidden
> behind the encryption layer in HTTPS. To do anything with them you must
> decrypt the traffic first. If you can decrypt, it turns into regular
> HTTP traffic and the normal access controls should then work as-is.
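[For reference, "decrypt the traffic" here means an ssl_bump setup along
these lines. A minimal peek-and-bump sketch, assuming Squid 4 with the
security_file_certgen helper (on 3.5 the helper is named ssl_crtd); the
CA certificate path and helper/DB locations below are illustrative, not
taken from this thread:

# Decrypt TLS so HTTP-layer ACLs such as "browser" can see User-Agent.
http_port 3128 ssl-bump cert=/etc/squid/ssl_cert/myCA.pem generate-host-certificates=on dynamic_cert_mem_cache_size=4MB

# Helper that mints per-host certificates signed by the local CA.
sslcrtd_program /usr/lib/squid/security_file_certgen -s /var/lib/squid/ssl_db -M 4MB

# Peek at the TLS client hello / SNI first, then decrypt the rest.
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all

Once the CONNECT tunnel is bumped, the requests inside it reach Squid as
plain HTTP, so browser ACLs and deny_info should apply to them unchanged.]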
So why does my first example actually work even for https sites?

acl denied_useragent browser Chrome
acl denied_useragent browser MSIE
acl denied_useragent browser Opera
acl denied_useragent browser Trident
[...]
http_access deny denied_useragent
http_reply_access deny denied_useragent
deny_info
http://proxy-server1/proxy-error/?a=%a&B=%B&e=%e&E=%E&H=%H&i=%i&M=%M&o=%o&R=%R&T=%T&U=%U&u=%u&w=%w&x=%x&acl=denied_useragent
denied_useragent

If the above "works", then another way would be to use a negated regular
expression such as:

acl denied_useragent browser (?!Firefox)

but I don't think that is allowed: Perl-style negative lookahead is not
part of the POSIX extended regular expressions that Squid's regex ACLs
use.

Vieri

_______________________________________________
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users