On 12/01/20 2:04 am, user wrote:
> Hello.
> I have used Squid 4.8 as a reverse proxy. The problem is that the remote (or
> local?) side closes the connection every 2-4 minutes, with the message
> "TCP_MISS_ABORTED/200" in the log.
That log tag is normal for traffic with Happy Eyeballs operating: the client
opens parallel connections and aborts whichever ones it does not end up using,
which Squid records as ABORTED. Without extra information ...
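
For reference, a minimal reverse-proxy (accelerator) squid.conf of the kind
described above might look like the sketch below. The site name, origin
address and ports are placeholders, not details taken from the report:

# Hypothetical accelerator sketch; www.example.com and 192.0.2.10 are
# placeholders, not values from the original message.
http_port 80 accel defaultsite=www.example.com

# Forward accelerated requests to the origin server.
cache_peer 192.0.2.10 parent 80 0 no-query originserver name=origin
cache_peer_access origin allow all

# Only accept requests for the accelerated site.
acl our_site dstdomain www.example.com
http_access allow our_site
http_access deny all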
On 16/01/20 5:43 am, Farid Agha wrote:
> Hi all,
>
> After reading archives and FAQ, I send this email.
>
> My main goal is to cache segments of a live video. The content-type is
> video/MP2T and the extension is .ts
>
> I am currently running an explicit proxy, and the cache is working fine for
> all files ...
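
If the .ts segments turn out to be the part that is not being cached, one
possible approach is a refresh_pattern matching the segment extension. This
is only a sketch; the 10-minute/60-minute lifetimes are illustrative guesses,
not values from the thread:

# Cache .ts video segments briefly when the origin sends no freshness info.
# Place this above the default "refresh_pattern ." line: the first matching
# pattern wins.
refresh_pattern -i \.ts$ 10 90% 60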
On Sun, Jan 19, 2020 at 10:22:56PM +1300, Amos Jeffries wrote:
>
> It should work without the anchor and suffix. Perhaps the URL is not
> actually that string?
>
Yes, it will work without the anchor, but using an anchor can make the regex
faster because it then does not have to scan the whole string.
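
As an illustration of that point (a sketch using the updater URL discussed in
this thread; the ACL names are made up for the comparison):

# Unanchored form: the regex library may try a match at every character
# position of a non-matching URL before giving up.
acl special_url_any url_regex updater\.maxon\.net/server_test

# Anchored form: a non-matching URL is rejected after one attempt at the
# start of the string.
acl special_url_start url_regex ^http://updater\.maxon\.net/server_test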
thanks Amos
#allow special URL paths
acl special_url url_regex "/usr/local/squid/etc/urlspecial.txt"
http_access allow special_url
#
#deny MIME types
acl mimetype rep_mime_type "/usr/local/squid/etc/mimedeny.txt"
http_reply_access allow special_url
http_reply_access deny mimetype
the reason why I ...
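
Both quoted files hold one regular expression per line. A hedged sketch of
possible contents follows; only the updater URL comes from this thread, and
the MIME type patterns are purely illustrative:

# /usr/local/squid/etc/urlspecial.txt -- one url_regex pattern per line
^http://updater\.maxon\.net/server_test

# /usr/local/squid/etc/mimedeny.txt -- one rep_mime_type regex per line
# (these example types are illustrative, not taken from the thread)
^application/x-msdownload$
^application/octet-stream$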
On 16/01/20 9:30 pm, Robert Marshall wrote:
> Hi all,
>
> I'm trying to set up a transparent proxy on my network so that all devices
> are forced to use Squid/SquidGuard for their network traffic, and so I can
> filter out undesirable destinations.
>
> I have Squid/SquidGuard running on a Raspberry Pi 4, ...
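
A minimal interception setup on the Pi might look like the sketch below. The
ports and the squidGuard paths are assumptions, and port-80 traffic still has
to be NAT-redirected to the intercept port (with iptables or nftables) on
whatever box routes the LAN's traffic:

# Hypothetical transparent (intercept) sketch; ports and paths are
# assumptions, not taken from the original message.
http_port 3128                 # explicit proxy for manually configured clients
http_port 3129 intercept       # NAT-redirected port-80 traffic lands here

# Hand every URL to SquidGuard for filtering; the binary and config paths
# are common defaults and may differ on Raspbian.
url_rewrite_program /usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf
url_rewrite_children 8

# Note: HTTPS traffic cannot be filtered this way without SSL-Bump and a
# locally trusted CA on every device.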
On 18/01/20 3:51 am, robert k Wild wrote:
> smashed it -
>
> acl special_url url_regex ^http://updater.maxon.net/server_test.*
> http_access allow special_url
>
It should work without the anchor and suffix. Perhaps the URL is not
actually that string?
* any (.*) at beginning or end is assumed, because url_regex does an
  unanchored substring match anyway.
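
In other words, a sketch of the trimmed-down form (the escaped dots are a
general regex nicety, not something stated in the reply):

# Unanchored substring match; the leading ^http:// and trailing .* from the
# original line add nothing, and literal dots are escaped.
acl special_url url_regex updater\.maxon\.net/server_test
http_access allow special_url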