Stuart Ballard <[EMAIL PROTECTED]> writes:
> I am having problems with apt-get which I believe are related to the
> fact that my ISP runs a caching proxy server which seems to
> transparently intercept all traffic sent to port 80 of anywhere. The
> symptom is that the first request to any server works fine, but then the
> second request receives a 400 bad request error. What happens after that
> varies; sometimes the requests continue to alternate ok/400/ok/400/...;
> other times the requests seem to go through okay but it appears that the
> wrong packages are retrieved; I get size mismatch errors on all of the
> packages except the first two. Other times the connection simply sits
> and waits for a very long time, then (apparently) times out and uses a
> new connection, which works immediately.
> 
> I believe the fix for this would be to configure apt to re-connect to
> the server every time in a pure http/1.0 stateless sort of way. However,
> there doesn't seem to be a preference for this. I have tried
> Acquire::http::Pipeline-Depth 0, which had some effect, but not nearly
> enough.
> 
> Another solution is to simply switch to ftp downloads, but presumably
> there are disadvantages to this too or it would be the default.
> 
> Is there a way to tell apt to make every package a separate http
> request? Or do I have to switch to ftp?
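
For what it's worth, the pipelining option you mention normally goes in
/etc/apt/apt.conf. A minimal sketch, assuming the stock http method (the
proxy line is only a commented-out placeholder, not your real proxy):

    // Send one request at a time instead of pipelining several
    Acquire::http::Pipeline-Depth "0";
    // Ask any proxy in the path not to serve cached copies
    // (a broken transparent proxy may ignore this)
    Acquire::http::No-Cache "true";
    // To talk to the proxy explicitly rather than be transparently
    // intercepted, something like (placeholder host/port):
    // Acquire::http::Proxy "http://proxy.example.com:8080/";

As you found, though, none of this forces apt to open a fresh connection
for every package, so it may not be enough here.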

Personally, I'd switch to ftp. There are no inherent problems with ftp;
it's just that http is more "universal", for lack of a better term. Many
people behind firewalls don't have any type of access to the outside
world except email and http. Plus, with all the emphasis these days on
the WWW, there are highly optimized http servers, and http tends to put
less of a load on the site you're using. But if you have trouble with
http, I wouldn't hesitate to switch to ftp.
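
Switching is just a matter of pointing the deb lines in
/etc/apt/sources.list at an ftp mirror instead of an http one. A rough
example (the mirror names here are only illustrative; use whatever
mirror you already have):

    # before
    deb http://http.us.debian.org/debian stable main contrib non-free
    # after
    deb ftp://ftp.us.debian.org/debian stable main contrib non-free

After that, run apt-get update so the package lists are fetched over
ftp as well.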

Gary
