How about:

import re
import socket
import urllib2

timeout = 10
socket.setdefaulttimeout(timeout)
try:
    auth_handler = urllib2.HTTPBasicAuthHandler()
    opener = urllib2.build_opener(auth_handler)  # only needed if the site requires authentication
    urllib2.install_opener(opener)
    req = urllib2.Request('http://website.com')
    f = urllib2.urlopen(req)
    notes = f.readlines()
    f.close()
    print "Everything is ok"
except IOError, r:
    p = str(r)
    if re.search(r'urlopen error timed out', p):
        print "Web page timed out"

You'll need to set the timeout to however long your website normally
takes to load.
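Since you mentioned the page may be very large, note that you don't have to
readlines() the whole thing; reading a single byte is enough to confirm the
server answered. A rough sketch (using Python 3's urllib.request, which
replaced urllib2; the function name site_is_up is just mine, adapt the URL
and timeout to your case):

```python
import socket
import urllib.request
import urllib.error

def site_is_up(url, timeout=10):
    """Return True if the URL answers within the timeout.

    Reads only the first byte of the body, so a very large page
    does not slow the check down.
    """
    try:
        f = urllib.request.urlopen(url, timeout=timeout)
        f.read(1)   # touch the response without downloading it all
        f.close()
        return True
    except (urllib.error.URLError, socket.timeout):
        return False
```

A HEAD request would avoid the body entirely, but not every server handles
HEAD correctly, so reading one byte of a GET is the safer quick check.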
Cheers
Astan

ShenLei wrote:
> Howdy, all,
>      I want to use Python to detect the accessibility of a website.
> Currently, I use urllib to obtain the remote webpage and see whether
> it fails. But the problem is that the webpage may be very large, so it
> takes too long. Certainly, there is no need to download the entire
> page. Could you give me a good and fast solution?
>     Thank you.
> --
> ShenLei
-- 
http://mail.python.org/mailman/listinfo/python-list