I'm trying to log multiple ACLs to syslog using Squid 3.3.8 on CentOS 7.
When I add the line:
access_log syslog:local4.info squid ACL1 ACL2 ACL3
nothing gets logged to syslog.
However, when I change the line to:
access_log syslog:local4.info squid ACL1
it logs correctly, but just for ACL1 (as
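If I read the access_log documentation right, listing several ACLs on one access_log line means a request must match ALL of them (they are ANDed, like an http_access line), so with three unrelated ACLs nothing ever qualifies. A rough sketch of two workarounds, assuming ACL1/ACL2/ACL3 are already defined elsewhere in squid.conf:

# Option 1: one access_log line per ACL, so a request matching any of them is logged
access_log syslog:local4.info squid ACL1
access_log syslog:local4.info squid ACL2
access_log syslog:local4.info squid ACL3

# Option 2: combine them with an any-of ACL (needs Squid 3.4 or later, not 3.3.8)
acl ACL_any any-of ACL1 ACL2 ACL3
access_log syslog:local4.info squid ACL_any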
If you are blocking it, then it can't be uploading 2G? How are you
measuring that it uploads 2G? Did you change squid's logging to support
that? (It doesn't log upload sizes, only download sizes, by default.) Are
you simply referring to the Content-Length header, as that would say 2G
even if the
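On the upload-size point raised above, a hedged sketch of how the request size could be made visible in access.log: the default "squid" logformat with %>st (request size including headers) appended. Untested here, paths illustrative:

# custom logformat: default squid format plus the request size (%>st)
logformat upsize %ts.%03tu %6tr %>a %Ss/%03>Hs %<st %>st %rm %ru %[un %Sh/%<a %mt
access_log /var/log/squid/access.log upsize

That would show how many bytes the client actually sent to the proxy, as opposed to whatever the Content-Length header claims.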
I am blocking grove.microsoft.com. Even though I am blocking it, I am
seeing large, 2 GB uploads from the client to the proxy (which does indeed
block it). It is almost as if the connection request (explicit) contains
the 2 GB POST request. Why is this happening? Has anyone seen this?
Michael
--
Strange:
connecting directly from the server via wget using the proxy works:
root @ cthulhu /tmp # wget -S https://cloudflare.com
--2016-04-15 02:19:41--  https://cloudflare.com/
Connecting to 127.0.0.1:3128... connected.
Proxy request sent, awaiting r
Finally.
1. Squid 4 can be built with LibreSSL.
2. Squid 4 with LibreSSL now supports CHACHA20_POLY1305 cryptography.
3. Squid 4 with LibreSSL still can't connect with CloudFlare itself.
WBR, Yuri.
PS. I suspect a bug in the 4.x branch specific f
Ok, nobody.
Well.
I've done my own research.
My suggestions:
CloudFlare now uses its own custom OpenSSL 1.0.2 build with very custom
patches adding CHACHA Poly support.
These patches are not in upstream. Moreover, the OpenSSL team has no plans in the
foreseeab
Any ideas?
Anybody?
13.04.16 2:37, Yuri Voinov writes:
>
> I suspect the matter may be openssl, not the OS:
>
> root @ cthulhu /patch # openssl version -a
> OpenSSL 1.0.1s 1 Mar 2016
> built on: Tue Mar 1 15:42:26 2016
> platform: solaris64-x86_64-c
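For what it's worth, a quick way to check whether that local OpenSSL build offers any ChaCha20-Poly1305 cipher suites at all (on a stock 1.0.1/1.0.2 build this errors out or prints nothing, which would fit the CloudFlare handshake failure):

# list ChaCha20 suites known to the local OpenSSL build
openssl ciphers -v 'ECDHE+CHACHA20'
# attempt a handshake against CloudFlare restricted to those suites
openssl s_client -connect cloudflare.com:443 -cipher 'ECDHE+CHACHA20'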
On 15/04/2016 1:18 a.m., Muhammad Faisal wrote:
> Hi Amos,
> As you mentioned, "Better to Store-ID cache the thing its Location header
> is pointing to." The problem is that the Location header has random strings in
> the URL, which causes a unique URL for the same object.
> Location:
> http://fs37.filehippo.com
Hi Amos,
As you mentioned, "Better to Store-ID cache the thing its Location header
is pointing to." The problem is that the Location header has random strings in
the URL, which causes a unique URL for the same object.
Location:
http://fs37.filehippo.com/9546/46cfd241f1da4ae9812f512f7b36643c/vlc-2.2.2-win64.e
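Not from the thread, but one way that random token could be collapsed is a rules file for the bundled storeid_file_rewrite helper; the regex and the internal domain below are only illustrative guesses at the URL layout:

# /etc/squid/storeid.rules  (format: regex pattern <TAB> store-id; loaded via store_id_program)
# map fsNN.filehippo.com/<id>/<random-hex>/<file> onto one stable key per file
^http:\/\/fs[0-9]+\.filehippo\.com\/[0-9]+\/[0-9a-f]+\/(.*)	http://filehippo.com.squid.internal/$1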
Thanks, I will keep grinding on other websites. Currently working on
streaming videos to be served from cache. I'm a bit confused about the cache hit
reason: why is it a MISS? Is it because of the 206 or some other reason:
TCP_MISS/206 3874196 GET
http://cw002.foo.net/files/videos/2015/12/30/145148227265e28-360.
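On the 206: Squid does not cache partial (206) replies, so range requests from the player will keep showing up as TCP_MISS/206. A hedged squid.conf sketch of the usual workaround, which forces Squid to fetch the whole object for selected domains (the domain is taken from the log line above and is only illustrative; this can cost extra bandwidth):

# fetch the full object even when the client asks for a byte range
acl videocdn dstdomain .foo.net
range_offset_limit -1 videocdn
# keep fetching the object even if the client aborts
quick_abort_min -1 KB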
On 14/04/2016 9:32 p.m., Muhammad Faisal wrote:
> Thanks Amos for a detailed response.
> Well, for Squid we are redirecting only HTTP traffic via policy routing.
> The object being served to clients is unique, but due to the
> different redirection for every user a new object gets stored.
>
> Wha
Thanks for the answer Amos...
2016-04-14 11:14 GMT+03:00 Amos Jeffries [via Squid Web Proxy Cache] <
ml-node+s1019090n4677079...@n4.nabble.com>:
> On 14/04/2016 5:03 p.m., rozi wrote:
>
> > Hi
> >
> > trying to set a splash page that pops up once a day for the clients; here
> > is my
> > conf:
> >
> >
Hi,
I'm seeing the following in cache.log after the upgrade:
Squid Cache: Version 3.5.16-20160412-r14025
2016/04/14 14:47:01 kid1| varyEvaluateMatch: Oops. Not a Vary match on
second attempt, 'http://connect.facebook.net/en_US/sdk.js'
'accept-encoding="gzip,%20deflate,%20sdch"'
2016/04/14 14:47:01
Thanks Amos for a detailed response.
Well, for Squid we are redirecting only HTTP traffic via policy routing.
The object being served to clients is unique, but due to the
different redirection for every user a new object gets stored.
What about HTTP streaming content having a 206 response code
On 14/04/2016 8:03 p.m., Muhammad Faisal wrote:
> Hi,
> I'm trying to deal with dynamic content to be cached by Squid 3.5 (I
> tried many other versions of Squid, e.g. 2.7, 3.1, 3.4). By dynamic I mean
> the URL for the actual content always changes; this results in
> wasted cache storage and
Okay, thanks Amos. I'm compiling the latest snapshot
squid-3.5.16-20160412-r14025; let's see.
--
Regards,
Faisal.
-- Original Message --
From: "Amos Jeffries"
To: squid-users@lists.squid-cache.org
Sent: 4/14/2016 1:54:20 PM
Subject: Re: [squid-users] Squid 3.5 no traffic saving
On 14/04/
On 14/04/2016 7:51 p.m., Muhammad Faisal wrote:
> Hi Amos,
> Is the regression fixed in the latest snapshot?
The Vary regression is, yes.
But I'm not clear on what your 'regression' was exactly. So you will
have to test and see.
Amos
On 14/04/2016 5:03 p.m., rozi wrote:
> Hi
>
> trying to set a splash page that pops up once a day for the clients; here is my
> conf:
>
> external_acl_type splash_page concurrency=100 ttl=10 %SRC
> /usr/lib/squid3/ext_session_acl -a -T 60 -b /home/e987654654/sessions.db
> acl existing_users external
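For reference, a minimal sketch of how that splash-page setup is usually completed, loosely based on the Squid wiki session example; the session lifetime, helper options, and splash URL below are placeholders rather than rozi's actual values:

external_acl_type splash_page ttl=60 concurrency=100 %SRC /usr/lib/squid3/ext_session_acl -t 86400 -b /home/e987654654/sessions.db
acl existing_users external splash_page
# clients with no session within the last 24h get redirected to the splash page
deny_info http://splash.example.com/splash.html existing_users
http_access deny !existing_users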
Hi,
I'm trying to deal with dynamic content to be cached by Squid 3.5 (I
tried many other versions of Squid, e.g. 2.7, 3.1, 3.4). By dynamic I mean
the URL for the actual content always changes; this results in
wasted cache storage and a low hit rate. As per my understanding I
have two cha
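The answer given elsewhere in the thread is Store-ID. Purely as a sketch of the squid.conf wiring, with illustrative paths and domains, and a rules file like the filehippo example earlier:

store_id_program /usr/lib64/squid/storeid_file_rewrite /etc/squid/storeid.rules
store_id_children 10 startup=2 idle=1 concurrency=0
# only consult the helper for the CDN domains you actually have rules for
acl cdn_domains dstdomain .filehippo.com
store_id_access allow cdn_domains
store_id_access deny all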
Hi Amos,
Is the regression fixed in the latest snapshot?
--
Regards,
Faisal.
On 9/04/2016 9:06 p.m., Muhammad Faisal wrote:
> Hi,
> I have deployed Squid 3.5.16 as a transparent proxy. I'm using the Squid
> Store-ID helper for CDN content caching; despite all efforts I don't see any
> traffic saving on u
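Two quick checks that could show whether Store-ID is producing any savings at all (paths and ports assume a typical install with squidclient available; adjust as needed):

# count hits recorded so far in the access log
egrep -c 'TCP_(MEM_)?HIT' /var/log/squid/access.log
# request and byte hit ratios from the cache manager
squidclient -h 127.0.0.1 -p 3128 mgr:info | egrep -i 'hits as'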