I have a process that takes a list of URLs, uses client.getPage() to
retrieve the data, writes the contents to disk, and then moves the
file to another directory.
# get feed list - this is a list of tuples:
# (url, name, username, password)
url_tuples = feed_list(feed_file)
global C_URLS
d = defer.succeed(log_start(log))
for tup_url in url_tuples:
    C_URLS += 1
    d.addCallback(get_page, tup_url)
    d.addErrback(get_page_error, tup_url[0])
    d.addCallback(page_to_file, tup_url)
    d.addErrback(page_to_file_error)
    d.addCallback(file_to_rsync_queue)
    d.addErrback(file_to_rsync_queue)  # note: same function as the callback - probably meant a matching errback
d.addCallback(stop_working)
#d.addErrback(self.gotError, (feed[0], 'while stopping'))
reactor.run()
I want to put this into a loop and run it as a daemon, but the looping
is causing me a problem. I've tried making the above into a procedure
(minus the stop_working callback and reactor.run()), and it does not
run. Any suggestions on how to encapsulate this in a loop would be
greatly appreciated.
_______________________________________________
Twisted-Python mailing list
[email protected]
http://twistedmatrix.com/cgi-bin/mailman/listinfo/twisted-python