On Tue, Aug 11, 2009 at 4:25 PM, Chris Withers <ch...@simplistix.co.uk> wrote:
> Hi All,
>
> I'm using the following script to download a 150Mb file:
>
> from base64 import encodestring
> from httplib import HTTPSConnection
> from datetime import datetime
>
> conn = HTTPSConnection('localhost')
> headers = {}
> auth = 'Basic ' + encodestring('username:password').strip()
> headers['Authorization'] = auth
> t = datetime.now()
> print t
> conn.request('GET', '/somefile.zip', None, headers)
> print 'request:', datetime.now() - t
> response = conn.getresponse()
> print 'response:', datetime.now() - t
> data = response.read()
> print 'read:', datetime.now() - t
>
> The output shows it takes over 20 minutes to do this. However, this is
> on a local network, and downloading the same file in IE takes under
> 3 seconds!
>
> I saw this issue:
>
> http://bugs.python.org/issue2576
>
> I tried changing the buffer size to 4096 in a subclass as the issue
> suggested, but I didn't see the reported speed improvement. I'm using
> Python 2.6.2.
>
> Does anyone know of an alternative library for creating http requests
> and getting their responses that's faster, but hopefully has a similar
> interface?
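One workaround that has been suggested in the discussion around issue 2576 is to read the response body in fixed-size chunks and join them once at the end, instead of a single giant read(). A minimal sketch of the pattern, shown against an in-memory stream so it is self-contained; in the real script you would pass the object returned by conn.getresponse() instead:

```python
import io

def read_in_chunks(fileobj, chunk_size=65536):
    """Read fileobj to completion in fixed-size chunks.

    Collecting the chunks in a list and joining once at the end avoids
    repeated buffer concatenation, which is one place where very large
    read() calls in older httplib/socket versions could lose time.
    """
    chunks = []
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        chunks.append(chunk)
    return b''.join(chunks)

# Simulated response body; with httplib you would pass the response
# object from conn.getresponse() here.
fake_response = io.BytesIO(b'x' * 1000000)
data = read_in_chunks(fake_response)
print(len(data))  # 1000000
```

Whether this helps in your case depends on where the time is actually going, which is why timing the individual phases as your script already does is useful.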
I tried to reproduce this, but I could not. Could you paste in the output of your script? Also, on the same box where you run this script, can you test with curl or wget?

--
David
blog: http://www.traceback.org
twitter: http://twitter.com/dstanek
--
http://mail.python.org/mailman/listinfo/python-list
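For the curl/wget comparison, something along these lines should do; the hostname, path, and credentials here just mirror the placeholders in Chris's script, so adjust them to match your setup:

```shell
# Time the same authenticated download with curl
# (-u sends Basic auth, -w prints the total transfer time).
curl -u username:password -o somefile.zip \
     -w 'total: %{time_total}s\n' \
     https://localhost/somefile.zip

# Or with wget, which prints its own transfer-rate summary.
wget --user=username --password=password https://localhost/somefile.zip
```

If these are also slow, the problem is in the server or the network rather than httplib.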