On May 28, 7:47 pm, "Martin P. Hellwig" <martin.hell...@dcuktec.org> wrote:
> On 05/28/10 21:44, Adam Tauno Williams wrote:
> > On Fri, 2010-05-28 at 15:41 +0100, Martin P. Hellwig wrote:
> >> On 05/28/10 13:17, Adam Tauno Williams wrote:
> >> <cut>
> >>> You should be able to point it at any file-like object. But, again,
> >>> why? If you have the data in the process, why send it to stdout and
> >>> redirect it? Why not just send the data to the client directly?
> >> Well, you might want to multiplex it to more than one client. I'm not
> >> saying that this is the case here, just something I imagine possible.
> > That still doesn't make sense. Why multiplex stdout? Why not just
> > multiplex the data into proper IPC channels in the first place?
>
> I am going on a stretch here; I mostly agree with you, just trying to
> illustrate that there could be corner cases where this is sensible.
> The current situation could be that there is a client/server program
> (binary only, perhaps) which is not multi-user safe.
>
> Python can be used as a wrapper around the server to make it
> multi-client: by emulating the exact behavior towards the client, the
> client program does not have to be changed.
>
> --
> mph
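The corner case Martin describes (one wrapped, possibly binary-only, program whose stdout several clients want to watch) can be sketched roughly like this. The function name, the use of per-client queues, and the stand-in command are all illustrative, not anything from the thread:

```python
# Illustrative sketch: read a wrapped program's stdout line by line
# and fan each line out to every connected client's queue.
import subprocess
import sys
from queue import Queue

def multiplex_stdout(cmd, client_queues):
    """Copy each stdout line of `cmd` to every client's queue."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        for q in client_queues:  # one queue per connected client
            q.put(line)
    proc.wait()

# Two pretend clients; a trivial command stands in for the wrapped binary.
clients = [Queue(), Queue()]
multiplex_stdout([sys.executable, "-c", "print('ready')"], clients)
```

In a real wrapper the queues would be drained by per-client handler threads or sockets, but the fan-out idea is the same.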
Hi,

This is the setup I was asking about. I've got users running a command-line client written in Python. They request services from a remote server, which fires off a LaTeX process, and I want them to see the stdout from that LaTeX process.

I was using multiprocessing to handle the requests, but the stdout showed up in the server's terminal window, where I started the server.serve_forever process. I switched to RPyC, and now the stdout appears on the client terminal making the request. I was trying to minimize the number of packages I use, hoping I could get the same capability from multiprocessing that I get with RPyC.

Thanks for the comments. I'm still processing what's been written here.

--Tim
--
http://mail.python.org/mailman/listinfo/python-list