On 6/11/2016 7:52 a.m., Garri Djavadyan wrote:
> On 2016-11-05 23:10, konradka wrote:
>> Hi Garri,
>>
>> Thanks for your responses mate !
>>
>> I did not realize that the squid was compiled with proxy user. Well
>> spotted!
>>
It looks like a permissions issue, but the squid error message is not g
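For what it's worth, a generic way to check that kind of permissions problem (the paths below are hypothetical examples, not taken from the thread):

  # Show build options; look for --with-default-user to see which
  # account Squid expects to run as (often 'proxy' on Debian-style builds).
  squid -v

  # Hypothetical paths: make sure the cache and log directories belong
  # to that user, and fix the ownership if they do not.
  ls -ld /var/spool/squid /var/log/squid
  chown -R proxy:proxy /var/spool/squid /var/log/squid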
On 8/11/2016 3:40 p.m., L. A. Walsh wrote:
> Alex Rousskov wrote:
>> On 11/07/2016 11:59 AM, L. A. Walsh wrote:
>>>
>>>(71) Protocol error (TLS code: X509_V_ERR_SELF_SIGNED_CERT_IN_CHAIN)
>>>
>>>Self-signed SSL Certificate in chain: /C=US/O=Entrust, Inc./OU=See
>>> www.entrust.net/legal-ter
Alex Rousskov wrote:
> On 11/07/2016 11:59 AM, L. A. Walsh wrote:
>> I have the SSL bump feature setup and so far have been happy with
>> it, but today, I got an error from a website,
> You got an error from Squid, not a website.
>> saying they detect my
>> ability to monitor my webtraffic and refuse to allow it:
Squid 4 is still beta.
On 08.11.2016 1:41, Alex Rousskov wrote:
> On 11/07/2016 12:36 PM, Yuri Voinov wrote:
>> Squid can't do auto-downloading (autocomplete) certificate chains
> Squid v4 can do that since r14769 (included in v4.0.13).
>
> Alex.
>
On 11/07/2016 12:36 PM, Yuri Voinov wrote:
> Squid can't do auto-downloading (autocomplete) certificate chains
Squid v4 can do that since r14769 (included in v4.0.13).
Alex.
It seems simple: there is no intermediate certificate in the chain.
Root CA bundles usually do not contain all intermediate CAs, because
browsers can simply download them from the server/site.
Squid can't auto-download (autocomplete) certificate chains and
requires configuring sslproxy_foreign_intermediate_certs.
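A minimal squid.conf sketch along those lines (the bundle path is a hypothetical example; the file would hold whatever intermediates your sites fail to send):

  # Hypothetical path: a locally maintained PEM bundle of intermediate CA
  # certificates that misconfigured origin servers do not include.
  sslproxy_foreign_intermediate_certs /etc/squid/foreign_intermediate_CAs.pem

As Alex notes elsewhere in this thread, Squid 4.0.13 and later can fetch missing intermediates automatically, so feeding them in by hand like this mainly matters on older releases.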
On 11/07/2016 11:59 AM, L. A. Walsh wrote:
> I have the SSL bump feature setup and so far have been happy with
> it, but today, I got an error from a website,
You got an error from Squid, not a website.
> saying they detect my
> ability to monitor my webtraffic and refuse to allow it:
Actually
I have the SSL bump feature setup and so far have been happy with
it, but today, I got an error from a website, saying they detect my
ability to monitor my webtraffic and refuse to allow it:
The following error was encountered while trying to retrieve the URL:
https://consumercomplaints.fcc.gov/
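One way to see the certificate chain the origin actually serves (a diagnostic sketch, not something from the original exchange):

  # Print every certificate the server presents during the TLS handshake
  openssl s_client -connect consumercomplaints.fcc.gov:443 -showcerts </dev/null

If the server omits an intermediate certificate, Squid's validation can fail even though browsers, which fetch missing intermediates on their own, still load the site.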
On 11/07/2016 03:19 AM, FredB wrote:
>
>> Use "login=PASS" (exact string) on the cache_peer.
>>
>> Along with an http_access check that uses an external ACL helper
>> which
>> produces "OK user=X password=Y" for whatever credentials need to be
>> sent.
>>
>> NP: on older Squid that may be "pass=" instead of "password=".
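A sketch of how those pieces could fit together in squid.conf (the helper path, peer name and port are hypothetical examples):

  # Hypothetical helper that prints "OK user=X password=Y" for requests
  # that should be forwarded with credentials.
  external_acl_type peer_login children-max=5 %SRC /usr/local/bin/peer_login_helper
  acl with_peer_login external peer_login

  # Running the check here lets the helper attach user=/password= to the
  # request, which login=PASS then relays to the peer, per the advice above.
  http_access allow with_peer_login

  # "login=PASS" must be that exact string.
  cache_peer upstream.example.com parent 3128 0 no-query login=PASS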
On 11/07/2016 08:48 AM, Daniel Dormont wrote:
> the request will return a 200 OK, and the response body
> will be a valid image, but it'll have this extra header:
>
> x-staticmap-api-warning:Failed to fetch image url
> http://download.example.com/mapicons/1
>
> This should be a transient error,
On 2016-11-07 20:11, Juan C. Crespo R. wrote:
Hi, Thanks for your response and help
1. Cache: Version 3.5.19
Service Name: squid
configure options: '--prefix=/usr/local/squid'
'--enable-storeio=rock,diskd,ufs,aufs'
'--enable-removal-policies=lru,heap' '--disable-pf-transparent'
'--enable-ipfw-
Hi,
My team uses Squid to proxy and cache certain external content in our
web platform. Now, we are looking to use it to cache images that come
from the Google Static Maps API. Sometimes, however, these images fail
to load properly, but unfortunately they do not respond with any of
the usual HTTP error codes.
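One possible way to keep those bad-but-200 responses out of the cache (a sketch using Squid 3.5's response-side controls, not advice given in this thread):

  # Match responses that carry the warning header described above.
  acl broken_map rep_header X-Staticmap-Api-Warning .

  # Refuse to store them, so the next request goes back to the origin.
  store_miss deny broken_map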
Hi, Thanks for your response and help
1. Cache: Version 3.5.19
Service Name: squid
configure options: '--prefix=/usr/local/squid'
'--enable-storeio=rock,diskd,ufs,aufs'
'--enable-removal-policies=lru,heap' '--disable-pf-transparent'
'--enable-ipfw-transparent' '--with-large-files' '--enable-
On 7/11/2016 10:59 p.m., Antony Stone wrote:
> On Monday 07 November 2016 at 10:53:14, Bilal Mohamed wrote:
>
>> Hi,
>>
>> I am getting following error while accessing google. Rest all websites are
>> ok. There is no ACL to block google.com
The message is not "Access Denied" (ACLs).
It is "Netwo
On Mon, 2016-11-07 at 06:25 -0400, Juan C. Crespo R. wrote:
> Good Morning Guys
>
>
> I've been trying to make a few ACLs to catch and then improve the BW
> of the HITs sent from my Squid box to my CMTS, and I can't find any
> way to do it
>
>
> Squid.conf: qos_flows tos local-hit=0x30
Good Morning Guys
I've been trying to make a few ACLs to catch and then improve the BW
of the HITs sent from my Squid box to my CMTS and I can't find any way
to do it
Squid.conf: qos_flows tos local-hit=0x30
Cisco CMTS: ip access-list extended JC
Int giga0/1
ip address 172.25.25.30 255
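A sketch of how the two ends could line up (the interface and the use of the ACL are hypothetical; it assumes a Squid build that supports qos_flows):

  # squid.conf - mark cache HITs served from local storage with TOS 0x30
  qos_flows tos local-hit=0x30

  ! Cisco side - a TOS byte of 0x30 is DSCP 12 (AF12), so an extended
  ! ACL can classify the marked HIT traffic coming from the Squid box:
  ip access-list extended JC
   permit ip host 172.25.25.30 any dscp af12

The ACL only classifies the traffic; it still has to be referenced by whatever rate-limiting or service-flow policy the CMTS applies to give the HITs more bandwidth.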
On Monday 07 November 2016 at 10:53:14, Bilal Mohamed wrote:
> Hi,
>
> I am getting following error while accessing google. Rest all websites are
> ok. There is no ACL to block google.com
Is your machine properly configured for IPv6?
Try the following:
ping www.google.com
ping
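The preview cuts Antony's command list short; a fuller set of checks along the same lines might be (flag spellings assume a Linux iputils ping):

  # Compare IPv4 and IPv6 reachability of the site
  ping -4 -c 3 www.google.com
  ping -6 -c 3 www.google.com

  # Does the name resolve to an AAAA (IPv6) address at all?
  dig AAAA www.google.com +short

If the host publishes AAAA records but the machine's IPv6 connectivity is broken, Squid can fail to retrieve the page even though other sites work.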
Hi,
I am getting following error while accessing google. Rest all websites are
ok. There is no ACL to block google.com
*ERROR*
*The requested URL could not be retrieved*
--
The following error was encountered while trying to retrieve the URL:
http://www.google.com/
> Use "login=PASS" (exact string) on the cache_peer.
>
> Along with an http_access check that uses an external ACL helper
> which
> produces "OK user=X password=Y" for whatever credentials need to be
> sent.
>
> NP: on older Squid that may be "pass=" instead of "password=".
>
> Amos
>
Ok thanks