On Nov 26, 10:54 am, bhunter <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I've used subprocess with 2.4 several times to execute a process, wait
> for it to finish, and then look at its output. Now I want to spawn
> the process separately, later check to see if it's finished, and if it
> is look at its output. I may want to send a signal at some point to
> kill the process. This seems straightforward, but it doesn't seem to
> be working.
>
> Here's my test case:
>
> import subprocess, time
>
> cmd = "cat somefile"
> thread = subprocess.Popen(args=cmd.split(), shell=True,
>                           stdout=subprocess.PIPE, stdin=subprocess.PIPE,
>                           stderr=subprocess.STDOUT, close_fds=True)
>
> while(1):
>     time.sleep(1)
>     if(thread.returncode):
>         break
>     else:
>         print thread.returncode
>
> print "returncode = ", thread.returncode
> for line in thread.stdout:
>     print "stdout:\t", line
>
> This will just print the returncode of None forever until I Ctrl-C it.
>
> Of course, the program works fine if I call thread.communicate(), but
> since this waits for the process to finish, that's not what I want.
>
> Any help would be appreciated.
I've read that this sort of thing can be a pain, though I'm sure someone
will post with other views. I have had some success using Python's
threading module. There's a pretty good walkthrough here (it uses
wxPython in its example):

http://wiki.wxpython.org/LongRunningTasks

Other places of interest include:

http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/491281
http://uucode.com/texts/pylongopgui/pyguiapp.html
http://sayspy.blogspot.com/2007/11/idea-for-process-concurrency.html

If I were doing something like this, I would have the process write its
output to a file and periodically check to see if the file has data yet;
there's a rough sketch of that idea below. Hopefully someone with more
knowledge will come along soon.

Mike
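P.S. Here's a rough, untested sketch of the file-based idea (Python 2.x).
The "somefile.out" filename and the one-second sleep are just placeholders;
the main point is that Popen.poll() is what updates returncode, which is
why checking returncode alone never sees the process finish:

import os, subprocess, time

out = open("somefile.out", "w")
proc = subprocess.Popen(["cat", "somefile"], stdout=out,
                        stderr=subprocess.STDOUT)

while True:
    # poll() sets and returns returncode; returncode stays None until the
    # child has actually exited, so without poll() the loop runs forever.
    if proc.poll() is not None:
        break
    # meanwhile you can peek at however much output has accumulated so far
    print "still running, %d bytes written" % os.path.getsize("somefile.out")
    time.sleep(1)

out.close()
print "returncode =", proc.returncode
for line in open("somefile.out"):
    print "stdout:\t", line,

You could also send the process a signal from the loop (proc.pid is the
child's pid) if you decide you need to kill it early.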