Hi, I am having problems with apt-get which I believe are related to the fact that my ISP runs a caching proxy server that seems to transparently intercept all traffic sent to port 80, regardless of destination. The symptom is that the first request to any server works fine, but the second request receives a 400 Bad Request error. What happens after that varies: sometimes the requests continue to alternate ok/400/ok/400/...; other times the requests seem to go through okay but it appears that the wrong packages are retrieved, and I get size-mismatch errors on all of the packages except the first two. Other times the connection simply sits and waits for a very long time, then (apparently) times out and opens a new connection, which works immediately.
I believe the fix would be to configure apt to reconnect to the server for every request, in a pure HTTP/1.0 stateless sort of way. However, there doesn't seem to be a preference for this. I have tried Acquire::http::Pipeline-Depth 0, which had some effect, but not nearly enough. Another solution would be to simply switch to FTP downloads, but presumably there are disadvantages to that too, or it would be the default. Is there a way to tell apt to make every package a separate HTTP request? Or do I have to switch to FTP? Thanks, Stuart.
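P.S. For reference, here is the exact setting I tried, placed in /etc/apt/apt.conf (I assume a snippet under /etc/apt/apt.conf.d/ would work the same way):

```
// Disable HTTP pipelining so apt waits for each response
// before sending the next request on the connection.
Acquire::http::Pipeline-Depth "0";
```

As mentioned above, this reduced the problem somewhat but did not eliminate it; what I am really after is an option (if one exists) that makes apt close the connection and reconnect for every single package.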

