[David Rasmussen]
>> I use urllib2 to do some simple HTTP communication with a web server.
>> In one "session", I do maybe 10-15 requests. It seems that urllib2
>> opens up a connection every time I do a request. Can I somehow make it
>> use _one_ persistent connection where I can do multiple GET->"receive
>> data" passes before the connection is closed?
[Diez B. Roggisch]
> Are you sure HTTP supports that?

Yes, HTTP 1.1 definitely supports multiple requests on the same
connection:

http://www.w3.org/Protocols/rfc2616/rfc2616-sec8.html#sec8.1

Some HTTP 1.0 clients also supported persistent connections, through
the non-standard "Keep-Alive" header.

> And even if it works - what is the problem with connections being
> created?

The URL above describes the benefits of persistent connections. The
primary problem with the old one-request-per-connection style is that
it creates more sockets than necessary.

To the OP: neither urllib nor urllib2 implements persistent
connections, but httplib does. See the httplib documentation page for
an example:

http://www.python.org/doc/2.4.2/lib/httplib-examples.html

Note, however, that even httplib is "synchronous", in that it cannot
pipeline requests: the response to the first request must be
completely read before a second request can be issued.

HTH,

--
alan kennedy
------------------------------------------------------
email alan: http://xhaus.com/contact/alan
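
P.S. A rough, untested sketch of the sort of thing httplib allows; the
host and paths below are just placeholders, not anything from the
original question:

    import httplib

    # One connection, several GET requests. With HTTP/1.1 the server
    # keeps the connection open unless it replies "Connection: close".
    conn = httplib.HTTPConnection("www.example.com")    # placeholder host
    for path in ("/page1", "/page2", "/page3"):         # placeholder paths
        conn.request("GET", path)
        response = conn.getresponse()
        # httplib cannot pipeline: the whole body must be read before
        # the next request is issued on the same connection.
        data = response.read()
        print path, response.status, len(data)
    conn.close()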