Fredrik Lundh wrote:

Steve Holden wrote:
> You will need to import the socket module and then call
> socket.setdefaulttimeout() to ensure that communication with
> non-responsive servers results in a socket exception that you can trap.

or you can use asynchronous sockets, so your program can keep processing the
pages that do arrive while it waits for the ones that don't.
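Fredrik is presumably referring to something like the standard library's
asyncore module. A minimal sketch, assuming Python 2 and plain HTTP on
port 80 (the host list is just an example):

    import asyncore, socket

    class AsyncHTTPClient(asyncore.dispatcher):
        """Fetch one page without blocking the rest of the program."""

        def __init__(self, host, path):
            asyncore.dispatcher.__init__(self)
            self.create_socket(socket.AF_INET, socket.SOCK_STREAM)
            self.connect((host, 80))
            self.buffer = 'GET %s HTTP/1.0\r\nHost: %s\r\n\r\n' % (path, host)

        def handle_connect(self):
            pass

        def handle_close(self):
            self.close()

        def handle_read(self):
            # A real program would store the data instead of printing it.
            print self.recv(8192)

        def writable(self):
            return len(self.buffer) > 0

        def handle_write(self):
            sent = self.send(self.buffer)
            self.buffer = self.buffer[sent:]

    # One dispatcher per page; asyncore.loop() services them all in a single
    # event loop, so a slow or dead server doesn't stall the other downloads.
    for host in ['www.python.org', 'www.example.com']:
        AsyncHTTPClient(host, '/')
    asyncore.loop()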
Steve Holden wrote:

[EMAIL PROTECTED] wrote:
> import urllib, sys
> pages = ['http://www.python.org', 'http://xxx']
> for i in pages:
>     try:
>         u = urllib.urlopen(i)
>         print u.geturl()
>     except Exception, e:
>         print >> sys.stderr, '%s: %s' % (e.__class__.__name__, e)

You will need to import the socket module and then call
socket.setdefaulttimeout() to ensure that communication with non-responsive
servers results in a socket exception that you can trap.
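Applied to the loop quoted above, that looks roughly like this (a sketch for
Python 2; the 10-second value is an arbitrary choice):

    import socket, urllib, sys

    # Any socket operation that stalls for more than 10 seconds now raises
    # an exception instead of blocking forever.
    socket.setdefaulttimeout(10)

    pages = ['http://www.python.org', 'http://xxx']
    for i in pages:
        try:
            u = urllib.urlopen(i)
            print u.geturl()
            u.close()
        except (IOError, socket.timeout), e:
            print >> sys.stderr, '%s: %s' % (e.__class__.__name__, e)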
[EMAIL PROTECTED] wrote:

    import urllib, sys

    pages = ['http://www.python.org', 'http://xxx']
    for i in pages:
        try:
            u = urllib.urlopen(i)
            print u.geturl()
        except Exception, e:
            print >> sys.stderr, '%s: %s' % (e.__class__.__name__, e)

This will print an error if a page fails to open; the rest open fine.
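For what it's worth, urllib converts low-level socket failures into IOError,
so if the second host doesn't resolve, the loop above prints something along
the lines of (exact wording depends on the platform's resolver):

    IOError: [Errno socket error] (-2, 'Name or service not known')

Note that the try/except on its own only handles requests that fail outright;
a server that accepts the connection and then goes silent will still hang the
loop unless a timeout is set as described above.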
RE: How to prevent the script from stopping before it should

[EMAIL PROTECTED] wrote:
> I have a script that downloads some webpages. The problem is that,
> sometimes, after I download a few pages the script hangs (stops).

What do you mean by "hangs"? Does it raise an error and stop, or does it
just sit there without returning?
I have a script that downloads some webpages. The problem is that,
sometimes, after I download a few pages the script hangs (stops).
(But sometimes it finishes in an excellent way, right to the end, and
downloads all the pages I want.)
I think the script stops if the internet connection to the server (from
which I am downloading the pages) is lost or becomes very slow.