Dear list,

Concerning the concept of the deferred: why is it more useful to go with a deferred that gets consumed and can only be fired once?
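To make sure I am describing the same thing, here is a minimal sketch of the pattern I end up with (the handler name is made up, and I may well be misusing the API):

    from twisted.internet.defer import Deferred, AlreadyCalledError

    def handle_response(data):
        # made-up handler, just prints whatever came back
        print("got:", data)
        return data

    d = Deferred()
    d.addCallback(handle_response)

    d.callback("first response")       # fires the callback chain once

    try:
        d.callback("second response")  # firing it a second time is not allowed...
    except AlreadyCalledError:
        # ...so I create a brand-new Deferred for every request instead
        d = Deferred()
        d.addCallback(handle_response)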
In my small script I realize I need to take special care that the deferred has not already been fired, and that I must explicitly create a new deferred for each network request.

In a parallel world, someone might have come up with a deferred concept that happily fires its callback as many times as there is data coming back from the server. Is that a dumb idea?

Could the deferred design be part of the solution to the network problem of two requests passing each other, where each end is not yet aware that the other has just sent a request? Buggy network nodes would expect a response, get a request instead, and go crazy...

Thanks for your help,