- Original Message -
From: Alex Rousskov
>
> The peer at 10.215.144.21:443 accepted Squid connection and then closed
> it, probably before sending anything to Squid
Thanks, Alex.
I had some luck trying the following options in cache_peer:
ssloptions=NO_SSLv3,NO_SSLv2,NO_TLSv1_2,NO
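For context, ssloptions attaches to the cache_peer line itself; a minimal sketch, assuming the peer address from the log quoted above and only the option names that are fully visible here:

# Sketch only -- adjust the option list to whatever the peer actually needs:
cache_peer 10.215.144.21 parent 443 0 no-query ssl ssloptions=NO_SSLv2,NO_SSLv3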
On 01/24/2017 02:11 PM, Yuri Voinov wrote:
> On 25.01.2017 2:50, Alex Rousskov wrote:
>> A short-term hack: I have seen folks successfully solving somewhat
>> similar problems using a localport ACL with an "impossible" value of
>> zero. Please try this hack and update this thread if it works for you:
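The quoted message is cut off before the actual configuration lines; going only by the description above, the hack looks roughly like this (the ACL name is made up):

# Squid-generated requests (e.g. fetching a missing intermediate certificate)
# arrive on no real listening port, so an "impossible" local port of zero
# singles them out:
acl fromSquidItself localport 0
http_access allow fromSquidItself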
On 25.01.2017 2:50, Alex Rousskov wrote:
> On 01/24/2017 12:20 PM, Yuri Voinov wrote:
>> On 25.01.2017 1:10, Alex Rousskov wrote:
>>> On 01/24/2017 11:33 AM, Yuri Voinov wrote:
http_access deny to_localhost
>>> Does not match. The destination is not localhost.
>> Yes, the destination is Squid itself.
On 01/24/2017 12:20 PM, Yuri Voinov wrote:
> On 25.01.2017 1:10, Alex Rousskov wrote:
>> On 01/24/2017 11:33 AM, Yuri Voinov wrote:
>>> http_access deny to_localhost
>> Does not match. The destination is not localhost.
> Yes, the destination is Squid itself. From Squid to Squid.
No, not "to squid": Th
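For context, to_localhost is the stock ACL from the default squid.conf, which is why a request whose destination is repository.certum.pl cannot match it:

acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1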
On my setup it is easy to reproduce.
It is enough to run wget:
wget -S https://yandex.com/company/
access.log immediately shows
0 - TCP_DENIED/403 3574 GET http://repository.certum.pl/ca.cer -
HIER_NONE/- text/html;charset=utf-8
before the request to the Yandex destination.
However, it execut
With detailed ACL debugging enabled, I got this transaction:
2017/01/25 01:36:35.772 kid1| 28,3| DomainData.cc(110) match:
aclMatchDomainList: checking 'repository.certum.pl'
2017/01/25 01:36:35.772 kid1| 28,3| DomainData.cc(115) match:
aclMatchDomainList: 'repository.certum.pl' NOT found
2017/01/25 01:36:35.77
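For reference, this kind of detailed ACL trace can be enabled with debug_options; section 28 is the ACL code, as the trace itself shows, and ALL,1 keeps everything else quiet:

debug_options ALL,1 28,3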
On 25.01.2017 1:10, Alex Rousskov wrote:
> On 01/24/2017 11:33 AM, Yuri Voinov wrote:
>
>>> 1485279884.648 0 - TCP_DENIED/403 3574 GET
>>> http://repository.certum.pl/ca.cer - HIER_NONE/- text/html;charset=utf-8
>
>> http_access deny !Safe_ports
> Probably does not match -- 80 is a safe port.
>
On 01/24/2017 11:33 AM, Yuri Voinov wrote:
>> 1485279884.648 0 - TCP_DENIED/403 3574 GET
>> http://repository.certum.pl/ca.cer - HIER_NONE/- text/html;charset=utf-8
> http_access deny !Safe_ports
Probably does not match -- 80 is a safe port.
> # Instant messengers include
> include "/usr
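For anyone checking along, this is the stock Safe_ports definition from the default squid.conf, which indeed lists 80:

acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443         # https
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http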
On 01/24/2017 01:02 AM, Vieri wrote:
> 2017/01/24 07:58:57.076 kid1| 83,5| bio.cc(139) read: FD 18 read 0 <= 65535
The peer at 10.215.144.21:443 accepted Squid connection and then closed
it, probably before sending anything to Squid (you did not show enough
FD 18 history to confirm that with certa
This is a working production server. I've checked the configuration twice and
see no problem.
Here:
# -
# Access parameters
# -
# Deny requests to unsafe ports
http_access deny !Safe_ports
# Instant messengers include
include "/usr/
On 01/24/2017 11:19 AM, Yuri Voinov wrote:
> It downloads directly via the proxy from localhost:
> As I understand it, the downloader also accesses via localhost, right?
This is incorrect. The downloader does not have a concept of an HTTP client
which sends the request to Squid, so "via localhost" or "via any
Maybe this feature is mutually exclusive with the
sslproxy_foreign_intermediate_certs option?
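For readers who have not used that directive, it simply points Squid at a bundle of intermediate CA certificates it may use to complete server chains; the file path below is only an example:

sslproxy_foreign_intermediate_certs /etc/squid/intermediate-ca-bundle.pem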
On 25.01.2017 0:19, Yuri Voinov wrote:
> Mm, hardly.
>
> It downloads directly via the proxy from localhost:
>
> root @ khorne /patch # http_proxy=localhost:3128 curl
> http://repository.certum.pl/ca.cer
>
Mm, hardly.
It downloads directly via the proxy from localhost:
root @ khorne /patch # http_proxy=localhost:3128 curl
http://repository.certum.pl/ca.cer
[raw binary DER certificate data (Certum CA) printed to the terminal; omitted]
On 01/24/2017 10:48 AM, Yuri Voinov wrote:
> It seems 4.0.17 tries to download certs but gets denied somewhere.
> However, the same URL with wget via the same proxy works.
> Why?
Most likely, your http_access or similar rules deny internal download
transactions but allow external ones. This is possible, fo
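The explanation above is cut off, but one crude way to let those internal downloads through is an explicit allow for the repository host shown in the access.log, placed before the rule that currently denies it (a sketch; the ACL name is made up):

acl missing_cert_hosts dstdomain repository.certum.pl
http_access allow missing_cert_hosts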
Hm. Another question.
It seems 4.0.17 tries to download certs:
1485279884.648 0 - TCP_DENIED/403 3574 GET
http://repository.certum.pl/ca.cer - HIER_NONE/- text/html;charset=utf-8
but gets denied somewhere.
However, the same URL with wget via the same proxy works:
root @ khorne /patch # wget -S htt
I just received news from my team that Squid works at first, but when they
restart the service it doesn't work. Has anyone encountered issues like that?
On Tue, Jan 24, 2017 at 12:56 AM, Amos Jeffries
wrote:
> On 24/01/2017 3:38 p.m., Mustafa Mohammad wrote:
> > By regression...I mean o
Which TLS option? I don't know how to configure that.
On Tue, Jan 24, 2017 at 10:08 AM, Mustafa Mohammad <
mustafamohamma...@gmail.com> wrote:
> No, it is messaging with HTTPS. If I were to use splice and peek, do I
> need a self-signed certificate or any type of certificate?
>
> On Tue, Jan 24, 2
No, it is messaging with HTTPS. If I were to use splice and peek, do I need
a self-signed certificate or any type of certificate?
On Tue, Jan 24, 2017 at 12:56 AM, Amos Jeffries
wrote:
> On 24/01/2017 3:38 p.m., Mustafa Mohammad wrote:
> > By regression...I mean our QA testing server. Let me exp
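On the splice/peek question above: a certificate (typically a self-signed CA imported into clients) is only needed for connections that are actually bumped; spliced connections pass the origin server's own certificate through untouched. A minimal sketch, assuming a plain forward proxy and an example file path:

http_port 3128 ssl-bump cert=/etc/squid/squid-ca.pem
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump splice all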
Hi everyone,
I was wondering why some of the visited pages are not being cached (I mean
"main" pages, like www.example.com). If I visit 50 pages, only 10 will be
cached. The text below is from the log files:
store.log:
1485272001.646 RELEASE -1 04F7FA9EAA7FE3D531A2224F4C7DDE5A 200
1485272011
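One quick way to see why a page ends up as RELEASE is to look at the response headers the origin sends through the proxy (Cache-Control: private/no-store, Vary, cookies and the like all prevent caching); a sketch, assuming the proxy listens on localhost:3128 and using www.example.com as a stand-in:

curl -sI -x localhost:3128 http://www.example.com/ | grep -iE 'cache-control|expires|pragma|vary|x-cache'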
- Original Message -
From: squid-users [mailto:squid-users-boun...@lists.squid-cache.org] On behalf
of David Touzeau
Sent: Tuesday, January 24, 2017 11:42
To: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] [3.5.23]: mozilla.org failed using SSL transparent
SSL23_GET_SERVER_HE
Sorry for the noise, I was able to find the cause: we use "dstdomain"
ACLs and Squid does reverse lookups.
It seems that Cloudflare DNS servers do not respond to PTR requests, and
since Squid has the default "dns_timeout" value of 30 seconds...:
$ host www.wireshark.org
www.wireshark.org has
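If switching those ACLs away from dstdomain (or getting PTR records served) is not an option, lowering the timeout at least caps the stall; the value below is only an example:

dns_timeout 5 seconds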
This is a different log trace from David's.
Here Squid is setting up a TUNNEL to the client's original dst-IP,
successfully. Any TLS funky stuff going on for this transaction is done
directly between server and client. Squid's only involvement is to peek at
the Hello messages and record them for i
On 23/01/2017 at 23:41, Amos Jeffries wrote:
> On 24/01/2017 3:58 a.m., FUSTE Emmanuel wrote:
>> All was carefully checked, and nothing in my configuration (acl etc ...)
>> explains why Squid insists on doing DNS requests for requests forwarded to
>> the peer(s).
>>
>
>> #bug #4575
>> url_rewrite_extr
On 24/01/2017 9:26 p.m., FredB wrote:
> Hello,
>
> FYI, I'm reading some parts of the code and found two little spelling errors.
>
Thanks. Applied.
Amos
teh TCP :-D teh drama :-D
Nice shot :-D
On 24.01.2017 14:26, FredB wrote:
> Hello,
>
> FYI, I'm reading some parts of the code and found two little spelling errors.
>
> FredB
>
> ---
>
> --- src/client_side.cc 2016-10-09 21:58:01.0
Hello,
FYI, I'm reading some parts of the code and found two little spelling errors.
FredB
---
--- src/client_side.cc 2016-10-09 21:58:01.0 +0200
+++ src/client_side.cc 2016-12-14 10:57:12.915469723 +0100
@@ -2736,10 +2736,10 @@
- Original Message -
From: Amos Jeffries
>
> You could try with a newer Squid version since the bio.cc code might be
> making something else happen in 3.5.23. If that still fails the 4.0 beta
> has different logic and far better debug info in this area.
Hi again,
I'm still strugglin
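For the record, the extra debug detail mentioned above is enabled with debug_options; the section below is a guess based on the bio.cc trace quoted earlier in the thread (83 is the TLS I/O code):

debug_options ALL,1 83,5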