On Sat, 24 Oct 2009 06:40:08 -0300, John O'Hagan <resea...@johnohagan.com>
wrote:
I have several instances of the same generator function running
simultaneously, some within the same process, others in separate processes.
I want them to be able to share data (the dictionaries passed to them as
arguments), in such a way that instances designated as "leaders" send their
dictionaries to "follower" instances.
I'm trying to use sockets to relay the dicts in pickled form, like this:
from socket import socket

PORT = 2050
RELAY = socket()
RELAY.bind(('', PORT))
RELAY.listen(5)
PICKLEDICT = ''
while 1:
    INSTANCE = RELAY.accept()[0]
    STRING = INSTANCE.recv(1024)
    if STRING == "?":
        INSTANCE.send(PICKLEDICT)
    else:
        PICKLEDICT = STRING
What I was hoping this would do is allow the leaders to send their dicts to
this socket and the followers to read them from it after sending an initial
"?", and that the same value would be returned for each such query until it
was updated.
But clearly I have a fundamental misconception of sockets, as this logic
only allows a single query per connection, new connections break the old
ones, and a new connection is required to send in a new value.
You may use sockets directly, but instead of building all the
infrastructure yourself, use a ThreadingTCPServer (or ForkingTCPServer);
they handle simultaneous connections for you, one request per thread (or
per child process). Even setting up a SimpleXMLRPCServer (plus either
ThreadingMixIn or ForkingMixIn) is easy enough.
Are sockets actually the best way to do this? If so, how to set it up to do
what I want? If not, what other approaches could I try?
See the wiki page on distributed systems:
http://wiki.python.org/moin/DistributedProgramming
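For instances that all run on the same machine, one standard-library route
(not mentioned above, so treat this as a sketch rather than a
recommendation from the thread) is multiprocessing.Manager, which serves a
real dict behind a proxy so leaders and followers in separate processes
share it without any pickling or socket code of your own:

```python
from multiprocessing import Manager, Process

def leader(shared, updates):
    # A leader pushes its current dictionary into the managed dict.
    shared.update(updates)

def follower(shared, out):
    # A follower takes a snapshot of the shared dict; the Manager
    # proxy handles the inter-process communication transparently.
    out.update(dict(shared))

if __name__ == "__main__":
    with Manager() as mgr:
        shared = mgr.dict()   # visible to every child process
        result = mgr.dict()
        p = Process(target=leader, args=(shared, {"tempo": 120}))
        p.start(); p.join()
        q = Process(target=follower, args=(shared, result))
        q.start(); q.join()
        print(dict(result))   # {'tempo': 120}
```

This only helps on a single host, though; for instances spread across
machines you are back to sockets, XML-RPC, or one of the libraries on the
wiki page above.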
--
Gabriel Genellina