Threads, signals and sockets (on UNIX)

2007-06-11 Thread geoffbache
Hi all,

I have a Python program (on UNIX) whose main job is to listen on a
socket, for which I use the SocketServer module. However, I would also
like it to be sensitive to signals received, which it isn't if it's
listening on the socket. ("signals can only be received between atomic
actions of the python interpreter", presumably - and control will not
return to Python unless something appears on the socket). Does anyone
have a tip for a good way to do this?

I can of course put the SocketServer in a thread and call signal.pause() in the main thread, but this falls down when the SocketServer terminates normally, as signal.pause() cannot be interrupted except via signals. So I tried sending a "dummy" signal (SIGCHLD) from the thread when the SocketServer terminates, which seems to work on Linux but not on Solaris - and it feels a bit hackish in any case. Surely there has to be a better way?
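(For reference, a minimal sketch - not from this thread - of one common workaround that keeps everything in the main thread: poll the listening socket with a timeout, so control returns to the interpreter regularly and any pending Python signal handlers get a chance to run. The handler class and port number are placeholders.)

import select
import signal
import SocketServer   # "socketserver" in Python 3

stop = []             # appended to by the signal handler

def note_signal(signum, frame):
    stop.append(signum)

class EchoHandler(SocketServer.StreamRequestHandler):
    def handle(self):
        self.wfile.write(self.rfile.readline())

signal.signal(signal.SIGINT, note_signal)
signal.signal(signal.SIGTERM, note_signal)

server = SocketServer.TCPServer(("localhost", 8765), EchoHandler)

while not stop:
    try:
        # Wait at most a second for a connection, then loop and re-check.
        readable, _, _ = select.select([server.socket], [], [], 1.0)
    except select.error:
        continue          # select interrupted by a signal; re-check "stop"
    if readable:
        server.handle_request()

This trades immediacy for simplicity: a signal is acted on within a second rather than instantly.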

Regards,
Geoff Bache



Re: Threads, signals and sockets (on UNIX)

2007-06-11 Thread geoffbache

> Twisted *should* be able to do this, as it uses non-blocking IO.
>
> http://twistedmatrix.com/trac/

Thanks for the tip. I'll take a look if nobody has any better
suggestions.

It still seems to me that what I'm trying to do is essentially quite simple, and shouldn't require a tool as large as Twisted to solve it. Isn't Twisted basically for web applications?

Geoff



Re: Threads, signals and sockets (on UNIX)

2007-06-11 Thread geoffbache

>
> You could probably use the Asyncore stuff to do it as well (with a lot
> less stuff).

This looked interesting. But it seems the asyncore stuff operates at the socket level, whereas I've currently just got a standard synchronous SocketServer, and the socket operations themselves are hidden beneath that layer. Can you point me at anything that might tell me how to combine asyncore with SocketServer, preferably without having to mess with the internals of SocketServer too much? :)

Geoff




Re: Threads, signals and sockets (on UNIX)

2007-06-11 Thread geoffbache
On Jun 11, 2:08 pm, Jean-Paul Calderone <[EMAIL PROTECTED]> wrote:
> On Mon, 11 Jun 2007 04:56:43 -0700, geoffbache <[EMAIL PROTECTED]> wrote:
>
> >> Twisted *should* be able to do this, as it uses non-blocking IO.
>
> >>http://twistedmatrix.com/trac/
>
> >Thanks for the tip. I'll take a look if nobody has any better
> >suggestions.
>
> Twisted is a pretty good suggestion in general. ;)
> >It still seems to me that what I'm trying to do is essentially quite
> >simple, and shouldn't require
> >as large a tool as Twisted to fix it. Isn't Twisted basically for web
> >applications?
>
> Twisted supports HTTP, but it does plenty of other things too.  Generally
> speaking, it's useful for any network application, plus some other stuff.
>

My application is only incidentally a network application. It doesn't have clients and servers as such; it just distributes its work via a grid engine and then lets these workers communicate their results back via sockets.

> You're half right about this being simple though, and not needing Twisted
> to solve the problem.  The only thing you need to do to solve the problem
> is avoid using either signals or threads.  Interaction between the two is
> very complicated and, as you've noticed, varies across platforms.  Twisted
> is solving the problem for you here by letting you do I/O without using
> threads, making signals *almost* simple.
>

Yes, I would avoid signals or threads if I could, but it's tough. The program is supposed to "appear" to be just a batch process, so handling e.g. Ctrl-C is essential. The standard SocketServer doesn't allow for this, so I need some other thread of control that will, or some means of "asynchronising" SocketServer internally.

Or there's always the really hacky, low-tech solution, which has a certain appeal: have the main thread check all the others for being alive and sleep in between...
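(For reference, a rough sketch of that polling idea; the main thread just watches the worker threads and sleeps in short intervals, so it stays responsive to signals:)

import threading
import time

def wait_for_worker_threads():
    while True:
        others = [t for t in threading.enumerate()
                  if t is not threading.currentThread()]
        if not any(t.isAlive() for t in others):
            break
        time.sleep(1)   # short sleeps keep the main thread receptive to signals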

Geoff



Getting subprocesses to be hidden on Windows

2007-08-27 Thread geoffbache
Hi,

As part of my efforts to write a test tool that copes with GUIs
nicely, I'm trying to establish how I can start a GUI process on
Windows that will not bring up the window. So I try to hide the window
as follows:

info = subprocess.STARTUPINFO()
info.dwFlags |= subprocess.STARTF_USESHOWWINDOW
info.wShowWindow = subprocess.SW_HIDE

proc = subprocess.Popen(..., startupinfo=info)

This works, in a way, but doesn't work recursively. I.e. if the
started process itself starts a window, that second window will not be
hidden. This even applies to dialog boxes within the application. So
instead of a lot of windows popping up I now get a lot of disembodied
dialogs appearing, which is a slight improvement but not much.

Also, certain processes (e.g. tkdiff) seem to ignore the directive to
be hidden altogether.

This is dead easy on UNIX with virtual displays like Xvfb. Can someone
shed any light if it's possible on Windows from python?

Regards,
Geoff Bache



Re: Getting subprocesses to be hidden on Windows

2007-08-28 Thread geoffbache
On Aug 27, 11:28 pm, [EMAIL PROTECTED] wrote:
> On Aug 27, 3:21 pm, geoffbache <[EMAIL PROTECTED]> wrote:
>
>
>
> > Hi,
>
> > As part of my efforts to write a test tool that copes with GUIs
> > nicely, I'm trying to establish how I can start a GUI process on
> > Windows that will not bring up the window. So I try to hide the window
> > as follows:
>
> > info = subprocess.STARTUPINFO()
> > info.dwFlags |= subprocess.STARTF_USESHOWWINDOW
> > info.wShowWindow = subprocess.SW_HIDE
>
> > proc = subprocess.Popen(..., startupinfo=info)
>
> > This works, in a way, but doesn't work recursively. I.e. if the
> > started process itself starts a window, that second window will not be
> > hidden. This even applies to dialog boxes within the application. So
> > instead of a lot of windows popping up I now get a lot of disembodied
> > dialogs appearing, which is a slight improvement but not much.
>
> > Also, certain processes (e.g. tkdiff) seem to ignore the directive to
> > be hidden altogether.
>
> > This is dead easy on UNIX with virtual displays like Xvfb. Can someone
> > shed any light if it's possible on Windows from python?
>
> > Regards,
> > Geoff Bache
>
> I'm confused. Why would you create a GUI if you're not going to
> actually display it? Isn't that the point of a GUI? Or are you talking
> about the command window popping up?
>
> Mike

Only in the context of testing it. If I run lots of GUI tests on my computer, I want the tested GUIs to remain hidden so I can still use my computer in the meantime...

Though if you can tell me how to stop the command window popping up on Windows I'll be grateful for that too (though it wasn't the original question).

Geoff



Re: Getting subprocesses to be hidden on Windows

2007-08-28 Thread geoffbache

> Which GUI toolkit are you using? Tkinter, wxPython, pyQt?

Primarily PyGTK, but I was hoping it wouldn't matter. I hope to be able to start the process as indicated in the original post from within my test tool and instruct the subprocess to be hidden (or minimized? would that be easier?), irrespective of what it is (it might be a Java GUI or anything for all I care...).

> As for
> losing the command window on Windows, the best way that I know of is
> to just change the extension of the python file itself from *.py to
> *.pyw . I'm pretty sure you can suppress command windows if you're
> calling them from the command line using a flag, but I can't recall
> the flag off the top of my head.
>

Thanks, that seemed to work.
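(For completeness: one way to suppress the console window when launching a console script from another process is the Win32 CREATE_NO_WINDOW creation flag, passed via subprocess's creationflags argument. A minimal sketch - older Python versions don't define a constant for this flag, so the raw value is used, and the script name is a placeholder:)

import subprocess

CREATE_NO_WINDOW = 0x08000000   # Win32 process creation flag

proc = subprocess.Popen(["python", "myscript.py"],   # hypothetical script
                        creationflags=CREATE_NO_WINDOW)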

> One way to test while still being able to use your computer is to
> install a virtual machine with VMWare or some similar product. I use
> VMWare's free software for testing some of my scripts, but I've heard
> that Microsoft's got a free virtual product that isn't half bad.

OK. If all else fails I might try that. But if there is a solution to
the original
problem it would be nice not to have to install VMWare everywhere for
convenient testing...

Geoff




Re: Getting subprocesses to be hidden on Windows

2007-08-28 Thread geoffbache

OK, more background needed. I develop the TextTest tool, which is a generic test tool that starts tested applications from the command line. The idea is that it can handle any system under test at all, whatever language it's written in, preferably without requiring a bunch of changes to the tested code before starting. I'd like to be able to pass some sort of flag to ensure that the system under test *and everything it starts* remains hidden.

I can do as you suggest in my PyGTK GUI, of course, but that's only one system under test. A generic solution, if there is one, would be much better. I felt like there ought to be one because:

a) it's easy on UNIX, and
b) I managed to hide the system under test fairly easily, just not its child windows and dialogs.

Thanks for the help anyway; it's another fallback if I can't find a solution.



Re: Getting subprocesses to be hidden on Windows

2007-08-28 Thread geoffbache
On 28 Aug, 18:18, Larry Bates <[EMAIL PROTECTED]> wrote:
> geoffbache wrote:
> > Hi,
>
> > As part of my efforts to write a test tool that copes with GUIs
> > nicely, I'm trying to establish how I can start a GUI process on
> > Windows that will not bring up the window. So I try to hide the window
> > as follows:
>
> > info = subprocess.STARTUPINFO()
> > info.dwFlags |= subprocess.STARTF_USESHOWWINDOW
> > info.wShowWindow = subprocess.SW_HIDE
>
> > proc = subprocess.Popen(..., startupinfo=info)
>
> > This works, in a way, but doesn't work recursively. I.e. if the
> > started process itself starts a window, that second window will not be
> > hidden. This even applies to dialog boxes within the application. So
> > instead of a lot of windows popping up I now get a lot of disembodied
> > dialogs appearing, which is a slight improvement but not much.
>
> > Also, certain processes (e.g. tkdiff) seem to ignore the directive to
> > be hidden altogether.
>
> > This is dead easy on UNIX with virtual displays like Xvfb. Can someone
> > shed any light if it's possible on Windows from python?
>
> > Regards,
> > Geoff Bache
>
> While I'm not entirely sure I understand what you want, I think you can
> accomplish it by using win32CreateProcess instead of subprocess.  You can run
> the application minimized or perhaps in a REALLY small window.  If you have
> modal dialog boxes, I don't think you can do anything as they don't run in the
> parent windows frame but rather outside (I could be wrong about this).
>
> -Larry

Hi Larry,

I don't know if that would help. I've tried running minimized from the command line as suggested by Mike, and that has the same issue (child windows and dialogs don't get minimized). So the question is moving away from how to technically achieve this in Python to whether Windows even supports it...

Geoff



How to convert markup text to plain text in python?

2008-02-01 Thread geoffbache
I have some marked up text and would like to convert it to plain text,
by simply removing all the tags. Of course I can do it from first
principles but I felt that among all Python's markup tools there must
be something that would do this simply, without having to create an
XML parser etc.

I've looked around a bit but failed to find anything, any tips?

(e.g. convert "Today is Friday" to "Today is Friday")
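For the record, a sketch of the sort of "first principles" approach I'd rather not have to write and maintain, assuming the markup is HTML-like:

from HTMLParser import HTMLParser   # "html.parser" in Python 3

class TagStripper(HTMLParser):
    def __init__(self):
        HTMLParser.__init__(self)
        self.pieces = []
    def handle_data(self, data):
        self.pieces.append(data)    # keep only the text between the tags
    def get_text(self):
        return "".join(self.pieces)

stripper = TagStripper()
stripper.feed("<b>Today</b> is <i>Friday</i>")
print stripper.get_text()           # -> "Today is Friday"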

Regards,
Geoff


Re: Problems with background processes on Windows

2009-03-28 Thread geoffbache

Hi Tim,

> If you trace through this:
>     python -m trace --trace communicate.py
>
> you'll see that it hangs in subprocess in the stdout_thread waiting for
> stdout to close.
>

Thanks for this tip, haven't used this before.

> I'm not sure I expect this to work as you expect.  When you open a null
> device, it's just another file handle.  Why should the behavior be
> different?

Well yes, but the point is surely that the standard output of the
background sleeping process is pointed to a different location? (you
can replace the null device with a file name of your choice and the
point is the same.) This process should not have any connection to the
standard output of sleep.py, and hence we shouldn't need to wait for
it to finish when collecting the standard output of sleep.py, surely?
(Even explicitly calling sys.stdout.close() in sleep.py doesn't seem
to help)

Also, the code behaves differently on Linux and Windows, and I haven't
called anything that is documented as behaving differently on
different platforms. If it's a genuine OS difference I'd be interested
in hearing why.

To put this another way: what can I do in sleep.py that allows me to
start a time-consuming background process and exit before it's
complete, while not forcing a process that is "communicating" with me
to also wait for this background process before proceeding?

Regards,
Geoff


Re: Problems with background processes on Windows

2009-03-30 Thread geoffbache
On Mar 30, 6:57 am, Gabriel Genellina  wrote:
> Gabriel Genellina <[EMAIL PROTECTED]> writes:
>
>
>
> > En Sat, 28 Mar 2009 06:03:33 -0300, geoffbache <[EMAIL PROTECTED]> escribió:
>
> > > Well yes, but the point is surely that the standard output of the
> > > background sleeping process is pointed to a different location? (you
> > > can replace the null device with a file name of your choice and the
> > > point is the same.) This process should not have any connection to the
> > > standard output of sleep.py, and hence we shouldn't need to wait for
> > > it to finish when collecting the standard output of sleep.py, surely?
> > > (Even explicitly calling sys.stdout.close() in sleep.py doesn't seem
> > > to help)
>
> > Thesis: When the subprocess module creates the child process, it inherits  
> > the stdin/stdout/stderr handles from its parent (even if its own  
> > stdin/stdout/stderr are redirected; they're different). Until the  
> > grandchild process finishes, the grandparent stdout.read() won't return,  
> > because the pipe isn't closed until the last handle to it is closed.
>
> I've confirmed the above description.
>
> --- begin p0.py ---
> import subprocess,os
>
> p1 = subprocess.Popen(["python", "p1.py"],
>       stdout=subprocess.PIPE,
>       stderr=open(os.devnull, "wt"),
>       stdin=open(os.devnull))
> print p1.communicate()
> --- end p0.py ---
>
> --- begin p1.py ---
> import subprocess,sys,os,msvcrt
>
> subprocess.Popen(
>     ["python", "p2.py", str(msvcrt.get_osfhandle(sys.stdout.fileno()))],
>     stdout=open(os.devnull, "wt"),
>     stderr=open(os.devnull, "wt"),
>     stdin=open(os.devnull, "rt"))
> print "exit p1.py"
> --- end p1.py ---
>
> --- begin p2.py ---
> import sys, win32api, time, os
>
> with open("p2.pid","wt") as f: f.write("%d" % os.getpid())
> win32api.CloseHandle(int(sys.argv[1]))
> time.sleep(30)
> --- end p2.py ---
>
> p2 has to close the inherited file handle corresponding to p1's stdout. Then,
> when p1 itself finishes, the writing end of the pipe is actually closed and p0
> can continue.
>
> C:\TEMP\subp>python p0.py
> ('exit p1.py\r\n', None)
>
> C:\TEMP\subp>type p2.pid
> 3018
> C:\TEMP\subp>tasklist | find "python.exe"
> python.exe                  3018                         0     4.304 KB
>
> I'm unsure this could be considered a bug in subprocess - the documentation
> says that parameter close_fds=True is not supported on Windows (so, the child
> process inherits all open files, and this includes the parent's
> stdin/stdout/stderr).
>
> At the end, this is due to the fact that file handles 0, 1, 2 have no special
> significance on Windows - it's the C runtime library which makes such things
> special, not the OS.

Thanks for this investigation.

I'm not the right person to pronounce on whether this is a Python bug
or not but I certainly found the behaviour surprising and unhelpful.
When you redirect output you expect it to be redirected, not for some connection to be invisibly maintained to the original location. I still don't really understand why sys.stdout.close() in p1 doesn't help either - is there some other way I can just ditch the standard output from within p1?

Actually, I'm quite happy if somebody can tell me how to work around
it, this is just a test I wrote to simulate some behaviour in my
system (which is the thing calling communicate(), i.e. p0). So both
the child process and grandchild process are just part of the test
(the real versions aren't python programs and p2 runs remotely).

Is there anything I can do to either of these processes to fix this
issue without importing non-standard libraries (i.e. import win32api
is out)?
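(One possibility, for the test scripts at least, since the real p2 isn't a Python program: ctypes has been in the standard library since Python 2.5, so the CloseHandle call in Gabriel's p2.py could presumably be made without win32api. A sketch, untested:)

import ctypes
import sys
import time

inherited_handle = int(sys.argv[1])                   # handle number passed by p1
ctypes.windll.kernel32.CloseHandle(inherited_handle)  # drop the inherited stdout handle
time.sleep(30)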

/Geoff


Converting a simple python script to a simple windows executable

2008-06-11 Thread geoffbache
Hi all,

I have a small python script that doesn't depend on anything except
the standard interpreter. I would like to convert it to a small .exe
file on Windows that can distributed alone without introducing
additional dependencies. I need to assume, because of other python
scripts, that anyone using this has python installed anyway so I hoped
it would be possible to do this. (Why I want to do this is a bit
involved but I can explain if necessary)

Unfortunately, it seems to be harder than it should be. I tried

(1) py2exe. This is really for when python isn't installed on the remote user's machine, so it requires you to distribute a large number of DLLs etc. which are part of the python installation. A bit silly when I know that the remote user has python anyway.

(2) setuptools. This works but requires that the remote user installs
setuptools also. Something of a shame when I don't require any
installation procedure at the moment.

(3) create a small .bat file to call the python script and then try to compile it to a .exe. There are hundreds of bat2exe tools out there, but many of them seem to produce an executable that runs the script in a separate command window, which differs from the .bat behaviour and isn't what I want. That's not counting the various freeware ones that just fail and the large number that want my money :)

Anyone have any better ideas?

Geoff Bache


Re: Converting a simple python script to a simple windows executable

2008-06-11 Thread geoffbache
On Jun 11, 9:49 pm, jay graves <[EMAIL PROTECTED]> wrote:
> On Jun 11, 2:25 pm, geoffbache <[EMAIL PROTECTED]> wrote:
>
> > Anyone have any better ideas?
>
> How about ExeMaker?
>
> http://effbot.org/zone/exemaker.htm
>
> I have not used it but it seems to do what you want.
>
> ...
> Jay

Thanks, this looks very promising! Will try out a bit more tomorrow
but I think it should work.

Regards,
Geoff


Re: Converting a simple python script to a simple windows executable

2008-06-13 Thread geoffbache

Thanks for all the suggestions. I have eventually used a heavily
edited version of ExeMaker which seems to do what I want.

Geoff


Annoying message when interrupting python scripts

2008-06-17 Thread geoffbache
Hi all,

I find that I semi-frequently get the cryptic message

import site failed; use -v for traceback

printed on standard error when an arbitrary python script receives
SIGINT while the python interpreter
is still firing up. If I use -v for traceback I get something along
the lines of

'import site' failed; traceback:
Traceback (most recent call last):
  File "/usr/lib/python2.4/site.py", line 61, in ?
import os
  File "/usr/lib/python2.4/os.py", line 683, in ?
import copy_reg as _copy_reg
  File "/usr/lib/python2.4/copy_reg.py", line 5, in ?
"""
KeyboardInterrupt

Is this a bug? I couldn't find any code, but I imagine something like
try:
    import site
except:
    sys.stderr.write("import site failed; use -v for traceback\n")

which should surely allow a KeyboardInterrupt exception through?
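Something along these lines is what I'd expect instead - sketched in Python, even though the real logic lives in the interpreter's C start-up code:

import sys

try:
    import site
except KeyboardInterrupt:
    raise                # let Ctrl-C during start-up actually interrupt
except Exception:
    sys.stderr.write("import site failed; use -v for traceback\n")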

Regards,
Geoff Bache


Re: Annoying message when interrupting python scripts

2008-06-17 Thread geoffbache

To clarify: this is more serious than an incorrect error message, as the intended interrupt gets swallowed and script execution proceeds. Sometimes I seem to get half-imported modules as well, with the script failing later with something like

AttributeError: 'module' object has no attribute 'getenv'

when trying to call os.getenv.

Regards,
Geoff Bache


Re: Annoying message when interrupting python scripts

2008-06-18 Thread geoffbache

Ben is correct in his interpretation of what I'm trying to say. The
code "should surely be changed" so that it lets a KeyboardInterrupt
exception through.

Geoff


Re: Annoying message when interrupting python scripts

2008-06-19 Thread geoffbache

As nobody decried the idea of this being a bug, it now is :)

http://bugs.python.org/issue3137

/Geoff


Terminating processes on Windows (handles and IDs)

2008-06-23 Thread geoffbache
Hi all,

I've always wondered why os.kill isn't supported on Windows. I found a
discussion somewhere from 2006 about this so it seems others have
wanted it, but still nothing. So I have a half-baked solution
involving calling "taskkill" on Windows Vista or "tskill" on Windows
XP via the shell. I feel there has to be a better way.

I'm also fairly confused about when I've got an ID and when I've got a
handle. The subprocess module gives me IDs which the above programs
accept, but other ways of spawning processes give me process handles
(while referring to them as process IDs in the docs...) and I don't
know how to kill a process with these. Besides, I've found an
amazingly useful PyGTK method, gobject.child_watch_add, which does
exactly what I want on UNIX but wants process handles on Windows. So I
can't use it in conjunction with subprocess there, and if I use some
other way of spawning processes I can't clean them up later.

Is there any way to convert one of these numbers to the other? Or to
get a process handle out of subprocess?
(There must be one down there somewhere, surely?)
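(For reference, the underlying Win32 conversion from an ID to a handle is roughly the following, sketched with ctypes; OpenProcess takes a process ID and returns a handle, which TerminateProcess then accepts. The access-rights constant here is just the minimum needed for termination.)

import ctypes

PROCESS_TERMINATE = 0x0001
kernel32 = ctypes.windll.kernel32

def kill_by_pid(pid):
    handle = kernel32.OpenProcess(PROCESS_TERMINATE, False, pid)
    if handle:
        kernel32.TerminateProcess(handle, 1)   # 1 becomes the exit code
        kernel32.CloseHandle(handle)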

Sorry for rambling a bit, am confused.

Regards,
Geoff Bache


Re: Terminating processes on Windows (handles and IDs)

2008-06-24 Thread geoffbache

Thanks for the tip. It does seem rather like overkill to introduce all these dependencies just to be able to kill a process, though...

I've discovered that subprocess.Popen objects have a member "_handle" which is undocumented but appears to work, so I'm using that for now. Better suggestions gratefully received...
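(Concretely, a minimal sketch of terminating a child via the undocumented _handle attribute, using ctypes from the standard library - with the obvious caveat that _handle may change between Python versions:)

import ctypes
import subprocess

proc = subprocess.Popen(["python", "-c", "import time; time.sleep(60)"])
ctypes.windll.kernel32.TerminateProcess(int(proc._handle), 1)
proc.wait()   # reap the exit status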

Geoff

> My way to do it is using excellent wmi module by Tim Golden, which
> relies on Mark Hammond's pywin32 and Windows native wmi functionality.
> Here is the link -http://tgolden.sc.sabren.com/python/wmi.html
> Maybe, there is a more elegant way of doing that, but it works for me,
> and i feel nice with wmi.



Re: Terminating processes on Windows (handles and IDs)

2008-06-25 Thread geoffbache

Thanks for the help Tim!

Good to see this is being sorted out in Python at last, although I suspect it'll be some time before I can rely on Python 2.6 alone...

I'm making use of _handle now and it works - most of the time. The remaining issues are probably PyGTK problems rather than Python ones, though, and hence off topic here.

Regards,
Geoff


Windows process ownership trouble

2008-06-25 Thread geoffbache
Am currently being very confused over the following code on Windows

import subprocess, os

file = open("filename", "w")
try:
proc = subprocess.Popen("nosuchprogram", stdout=file)
except OSError:
file.close()
os.remove("filename")

This produces the following exception:

Traceback (most recent call last):
  File "C:\processown.py", line 10, in 
os.remove("filename")
WindowsError: [Error 32] The process cannot access the file because it
is being used by another process: 'filename'

How can it be in use by another process? The process didn't even
start, right?

Would appreciate some help: is this a Python bug, or a Windows bug, or
just me being confused...?


Re: Windows process ownership trouble

2008-06-26 Thread geoffbache
Thanks Tim, very helpful again.

I've now reported this as http://bugs.python.org/issue3210
and implemented your suggested workaround.

Regards,
Geoff

On Jun 25, 9:19 pm, Tim Golden <[EMAIL PROTECTED]> wrote:
> geoffbache wrote:
> > Am currently being very confused over the following code on Windows
>
> > import subprocess, os
>
> > file = open("filename", "w")
> > try:
> >     proc = subprocess.Popen("nosuchprogram", stdout=file)
> > except OSError:
> >     file.close()
> >     os.remove("filename")
>
> Forgot to say: slightly awkward, but you can work around
> it like this:
>
> 
> import os
> import subprocess
>
> f = open ("filename", "w")
> try:
>proc = subprocess.Popen ("blah", stdout=f)
> except OSError:
>os.close (f.fileno ())
>
> os.remove ("filename")
>
> 
>
> TJG



Re: Windows process ownership trouble

2008-06-26 Thread geoffbache

Tim,

Unfortunately my previous message was premature: it seems your workaround doesn't work on my system either (Windows XP, Python 2.5.1). I get the following printed out:

Traceback (most recent call last):
  File "C:\TextTest\processown.py", line 12, in 
os.remove ("filename")
WindowsError: [Error 32] The process cannot access the file because it
is being used by another process: 'filename'
close failed: [Errno 9] Bad file descriptor

Any ideas?

Geoff


Re: Windows process ownership trouble

2008-06-26 Thread geoffbache

Tim,

I copied your code exactly from my browser and ran it, so I don't
think there was a typo.
I could upgrade to Python 2.5.2 I suppose to compare and contrast, but
I need to support older
Python versions anyway so it's a bit academic...

Your speculation about garbage collection did set me going, however,
and I discovered
that the following code does work on my system, so now I have a
functional workaround:

import os
import subprocess

def runProcess():
    f = open ("filename", "w")
    try:
        proc = subprocess.Popen ("blah", stdout=f)
    except OSError:
        f.close ()

runProcess()
os.remove ("filename")

So it seems that some things are only being garbage collected when the
function exits, but not when the
except clause exits or when the exception is thrown.

Geoff


Persuading ConfigParser to give me the section elements in the same order as the file

2008-09-10 Thread geoffbache
Hi all,

I recently needed to parse a file that was perfect for ConfigParser
apart from one thing: the elements in the sections, although
definitions, could in some cases clash with each other and therefore
it was important to be able to retrieve them in the same order as they
appeared in the file.

Unfortunately ConfigParser uses ordinary dictionaries for the section
elements and they are therefore returned in an arbitrary order.

The only solution I found was to copy ConfigParser.py and replace all
the dictionaries with "sequential dictionaries"
which are exactly like dictionaries except that elements are returned
in the order they were inserted. (see
http://home.arcor.de/wolfgang.grafen/Python/Modules/seqdict/Seqdict.html)

I wonder if there was a better way? For example, is there any hook
that could modify what is created by the statement

x = {}

I tried setting

__builtins__.dict = ndict.seqdict

But that didn't seem to have any effect on the above statement.

As a secondary question, I find sequential dictionaries to be an
essential part of programming in Python and I use them all the time. I
wondered a bit if there were any plans or proposals to include them as
part of the Python library?

Regards,
Geoff Bache


Re: Persuading ConfigParser to give me the section elements in the same order as the file

2008-09-11 Thread geoffbache

Hi Matt,

> Have a look at this:http://www.python.org/dev/peps/pep-0372/
>

Thanks, that was very useful. Good to know these things are being
considered.

> Looking at the config parser module, it looks like there are only a
> couple of places where {} is used. I would create a mixin class to
> replace the offending methods. That should work because it looks like
> you only have to replace "__init__" and "add_section". So...
>
> class OrderedConfigParserMixin:
>     def __init__(self, defaults=None):
>         self._sections = ndict.seqdict()
>         self._defaults = ndict.seqdict()
>         if defaults:
>             for key, value in defaults.items():
>                 self._defaults[self.optionxform(key)] = value
>
>     def add_section(self, section):
>         """Create a new section in the configuration.
>
>         Raise DuplicateSectionError if a section by the specified name
>         already exists.
>         """
>         if section in self._sections:
>             raise DuplicateSectionError(section)
>         self._sections[section] = ndict.seqdict()
>
> # Then you can use this to create your own ordered config parsers.
> Note that
> # multiple inheritance in python uses a breadth first search. If you
> want
> # the methods on your mixin to get called instead of the methods on
> the
> # original class you must include the mixin first.
>
> from ConfigParser import RawConfigParser, ConfigParser,
> SafeConfigParser
>
> class OrderedRawConfigParser(OrderedConfigParserMixin,
> RawConfigParser):
>     pass
>
> class OrderedConfigParser(OrderedConfigParserMixin, ConfigParser):
>     pass
>
> class OrderedSafeConfigParser(OrderedConfigParserMixin,
> SafeConfigParser):
>     pass
>
> I don't know if this is the _best_ approach, but it is certainly much
> preferred over monkey patching the built-ins module. Note that I
> haven't tested any of the above code.

Yes, I tried this first. But actually you missed the main place where
dictionaries are created, which is the monster method _read, the line
being

cursect = {'__name__': sectname}

I thought that by the time I'd copied that whole method just to edit that line, I might as well just copy the whole file and forget the inheritance :)

btw, the PEP you pointed me at indicated ConfigParser will take a
dict_type argument for exactly this purpose in Python 2.6, so I look
forward to when I can use that instead...
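For future reference, that would presumably look something like the sketch below. Note that collections.OrderedDict itself only arrives in Python 2.7/3.1, so on 2.6 the dict_type would still need to be a third-party ordered mapping such as the seqdict used above. The file and section names are placeholders.

from collections import OrderedDict
import ConfigParser

parser = ConfigParser.RawConfigParser(dict_type=OrderedDict)
parser.read("settings.ini")                  # hypothetical file name
for name, value in parser.items("section"):  # hypothetical section name
    print name, value                        # comes back in file order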

Regards,
Geoff


PYTHONPATH and eggs

2010-03-03 Thread geoffbache
Hi all,

I have a very simple problem that seems to have no simple solution.

I have a module which is installed centrally and lives in a Python
egg. I have experimented with some minor changes to it and would like
to set my PYTHONPATH to pick up my local copy of it, but don't want to
have to figure out how to build my own version of the "egg" if
possible.

Unfortunately, the location from PYTHONPATH ends up after the eggs in sys.path, so I can't persuade Python to import my version. The only way I've found to fix it is to copy the main script and manually hack sys.path at the start of it, which isn't really very nice. I wonder if there is a better way, as I can't be the first person to want to do this, surely?
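(Concretely, the hack amounts to something like this at the top of the copied script; the module name and path are placeholders:)

import os
import sys

# Push the local working copy ahead of the centrally installed egg.
sys.path.insert(0, os.path.expanduser("~/work/mymodule"))

import mymodule   # hypothetical name; now resolves to the local copy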

I've seen that this issue has been discussed elsewhere and flagged as a problem (e.g.
http://mail.python.org/pipermail/distutils-sig/2009-January/010755.html)
but I've been unable to find any suggestions for workarounds, or any indication of whether this will be or has been fixed.

Regards,
Geoff Bache


Re: PYTHONPATH and eggs

2010-03-04 Thread geoffbache

On Mar 4, 3:24 am, David Cournapeau  wrote:
> On Wed, Mar 3, 2010 at 7:14 PM, geoffbache  wrote:
> > Unfortunately, the location from PYTHONPATH ends up after the eggs in
> > sys.path so I can't persuade Python to import my version. The only way
> > I've found to fix it is to copy the main script and manually hack
> > sys.path at the start of it which isn't really very nice. I wonder if
> > there is any better way as I can't be the first person to want to do
> > this, surely?
>
> One way is to never install things as eggs: I have a script
> hard_install which forces things to always install with
> --single-externally-managed blablabla. This has worked very well for
> me, but may not always be applicable (in particular if you are on a
> platform where building things from sources is difficult).

Thanks for the tips. Is your script generic at all? I wonder if you'd
be prepared to share it?

Figuring out virtualenv would also be an option, as would figuring out
how to build my own egg, but both these solutions feel like overkill
to me just to enable a small bit of tweaking.

/Geoff