Re: Securing a future for anonymous functions in Python
Alex Martelli wrote:
> We should have an Evilly Cool Hack of the Year, and I nominate Paul du
> Bois's one as the winner for 2004. Do I hear any second...?

Thank you :-) I am busy writing it up as a recipe. I think I have a pleasing way for it to be portable, even. Unfortunately, that removes some of the evil.

p
--
http://mail.python.org/mailman/listinfo/python-list
Re: Overcoming herpetophobia (or what's up w/ Python scopes)?
kj wrote:
> I am hoping that it may be better this time around. For one thing,
> like Perl, Python was then (and maybe still is) a "work in progress."
> So I figure that Python scoping may have improved since then. Even
> if not, I think that Python is mature enough by now that adequate
> alternatives must have been devised for the Perlish features that
> I missed during my first attempt.

I don't know of any tutorials, but given that you're familiar with the concept of closures and scopes, I don't think you really need one to understand what Python does and doesn't provide.

Python does allow you to refer to variables in lexically enclosing scopes, and it closes over them properly. Python (annoyingly) does _not_ provide a way to write to the closed-over variable: assignments always go into the local scope. There is a "global" statement that lets you target the module scope; one can imagine an analogous statement that targets an intervening scope, but it doesn't exist.

Some (incorrectly) interpret this to mean that the variable isn't actually closed over. The following code executes without errors, which shows that the closure really does refer to the outer variable, not some constant copy:

    def closure_test():
        def make_closure():
            def closure():
                # a += 1  # this would give you an UnboundLocalError at runtime
                return a
            return closure

        fn = make_closure()
        a = 1
        assert fn() == a
        a = 2
        assert fn() == a

    closure_test()

The canonical way around this is to add a layer of indirection -- isn't that always the solution?

    def closure_test2():
        def make_closure():
            def closure():
                a[0] += 1
                return a[0]
            return closure

        fn = make_closure()
        a = [1]
        assert fn() == 2
        assert fn() == 3

    closure_test2()

p
--
http://mail.python.org/mailman/listinfo/python-list
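(A later note, not part of the original post: Python 3 added exactly the imagined intervening-scope statement as `nonlocal`, via PEP 3104. A minimal sketch, with names of my own choosing:)

```python
def make_counter():
    count = 0

    def increment():
        nonlocal count  # write to the variable in the enclosing scope
        count += 1
        return count

    return increment
```

With `nonlocal`, the one-element-list indirection trick becomes unnecessary.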
Re: Deadlock detection
(warning: pedantic and off-topic response)

NP-Complete does not mean "equivalent to the halting problem." It means "polynomial-time equivalent to any other NP-Complete problem." NP-Complete problems are decidable and solvable in "only" exponential time; the halting problem is undecidable, which is much harder!

And of course, just the fact that a problem is NP-complete doesn't mean that you can't construct algorithms that do a pretty good job a pretty good fraction of the time.

--
http://mail.python.org/mailman/listinfo/python-list
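As an illustration of that last point (my own sketch, not from the thread): the common single-resource case reduces to cycle detection in a wait-for graph, which is cheap -- no NP-hardness in sight. All names here are hypothetical:

```python
def has_deadlock(wait_for):
    """Detect a cycle in a wait-for graph.

    wait_for maps each blocked process to the single process it waits on,
    so the graph has out-degree <= 1 and a pointer walk with a visited
    set finds any cycle.  A cycle means deadlock.
    """
    for start in wait_for:
        seen = set()
        node = start
        while node in wait_for:
            if node in seen:
                return True  # walked back onto our own path: a cycle
            seen.add(node)
            node = wait_for[node]
    return False
```

A `has_deadlock({'A': 'B', 'B': 'A'})` call reports the classic two-process deadlock; a simple chain of waits does not.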
Re: Example Code - Named Pipes (Python 2.4 + ctypes on Windows)
Srijit Kumar Bhadra wrote:
> However, I wish that there was more documentation of win32all beyond
> existing PyWin32.chm.

I suspect you have already used the "more documentation" -- it's the MSDN docs.

p
--
http://mail.python.org/mailman/listinfo/python-list
Re: non-blocking PIPE read on Windows
placid wrote:
> What i need to do is, create a process using subprocess.Popen, where
> the subprocess outputs information on one line (but the info
> continuesly changes and its always on the same line) and read this
> information without blocking, so i can retrieve other data from the
> line i read in then put this in a GUI interface.
>
> readline() blocks until the newline character is read, but when i use
> read(X) where X is a number of bytes then it doesnt block(expected
> functionality) but i dont know how many bytes the line will be and its
> not constant so i cant use this too.

I wrote something for this the other day. The class has a getline() method, which either returns a line or raises an exception instead of blocking. It also raises an exception instead of returning EOF. My use case had me reading from multiple processes at once; since select() doesn't work on files in win32, I had to get a little cheesy. I've appended the function that implements that use case, for reference.

The central idea is to use PeekNamedPipe to figure out what's in the pipe. You can then read that data without fear of blocking. I wrote it quickly, so the code is a little embarrassing, but... hopefully useful all the same.

    import os
    import msvcrt        # win32-specific
    import win32pipe     # from the pywin32 package
    import pywintypes

    class NoLineError(Exception): pass
    class NoMoreLineError(Exception): pass

    class liner(object):
        """Helper class for multi_readlines."""
        def __init__(self, f):
            self.fd = f.fileno()
            self.osf = msvcrt.get_osfhandle(self.fd)
            self.buf = ''

        def getline(self):
            """Returns a line of text, or raises NoLineError or NoMoreLineError."""
            try:
                data, avail, _ = win32pipe.PeekNamedPipe(self.osf, 0)
            except pywintypes.error:
                # Pipe closed: give up what we have, then that's it
                if self.buf:
                    ret, self.buf = self.buf, None
                    return ret
                else:
                    raise NoMoreLineError
            if avail:
                self.buf += os.read(self.fd, avail)

            idx = self.buf.find('\n')
            if idx >= 0:
                ret, self.buf = self.buf[:idx+1], self.buf[idx+1:]
                return ret
            else:
                raise NoLineError

    def multi_readlines(fs):
        """Read lines from |fs|, a list of file objects.
        The lines come out in arbitrary order, depending on which files
        have output available first."""
        if type(fs) not in (list, tuple):
            raise Exception("argument must be a list.")
        objs = [liner(f) for f in fs]
        for i, obj in enumerate(objs):
            obj._index = i
        while objs:
            for i, obj in enumerate(objs):
                try:
                    yield (obj._index, obj.getline())
                except NoLineError:
                    pass
                except NoMoreLineError:
                    del objs[i]
                    break  # Because we mutated the list we're iterating over

--
http://mail.python.org/mailman/listinfo/python-list
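(A portable alternative, my own sketch rather than part of the post above: instead of peeking at the pipe, drain it on a daemon thread into a Queue, which works on any platform. The child command here is a hypothetical stand-in for the real subprocess.)

```python
import subprocess
import sys
import threading
from queue import Queue

def stream_lines(proc):
    """Drain proc.stdout on a background thread so the caller never blocks.

    Lines appear on the returned Queue as they arrive; None is the EOF
    sentinel.
    """
    q = Queue()

    def pump():
        for line in proc.stdout:
            q.put(line)
        q.put(None)  # EOF

    threading.Thread(target=pump, daemon=True).start()
    return q

# Hypothetical child process that prints one line and exits.
proc = subprocess.Popen([sys.executable, "-c", "print('hello')"],
                        stdout=subprocess.PIPE, text=True)
lines = stream_lines(proc)
```

In a GUI, you would poll with `lines.get_nowait()` from a timer callback and catch `queue.Empty`, so the event loop never stalls.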
Re: genexp surprise (wart?)
The generator is in its own scope. For proof, try accessing q outside the generator.

There are two possibilities. The first is that you don't know what closures are and are complaining that Python has them. That would be amusingly ironic, but I'm guessing you do know (if you don't, google "make_adder" and be enlightened).

The second is that you don't like the late-binding behavior of generator expressions. PEP 289 has this to say:

> After much discussion, it was decided that the first (outermost)
> for-expression should be evaluated immediately and that the remaining
> expressions be evaluated when the generator is executed.

and:

> After exploring many possibilities, a consensus emerged that [...] [for]
> more complex applications, full generator definitions are always superior
> in terms of being obvious about scope, lifetime, and binding

And as an illustration of that last point, consider:

    def sieve_all(n=100):
        # generate all primes up to n
        def filter_multiples(input, m):
            for q in input:
                if q % m != 0:
                    yield q

        stream = iter(xrange(2, n))
        while True:
            p = stream.next()
            yield p
            # this is now a redundant comment.
            # filter out all multiples of p from stream
            stream = filter_multiples(stream, p)

p
--
http://mail.python.org/mailman/listinfo/python-list
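(A minimal demonstration of the PEP 289 rule, mine rather than the PEP's: the outermost iterable is evaluated when the genexp is created, while the other clauses are evaluated lazily and therefore see later rebindings.)

```python
def gen_binding_demo():
    m = 2
    gen = (q for q in range(10) if q % m == 0)  # range(10) is evaluated here
    m = 3  # the "if" clause is evaluated lazily, so it sees m == 3
    return list(gen)
```

Calling `gen_binding_demo()` returns the multiples of 3, not of 2, below 10.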
TypeError + generator + str.join(): Bug or user error?
Using win32 Python 2.4.1, I have a minimal test program:

    def generate():
        raise TypeError('blah')
        yield ""

    print "\n".join(generate())

Executing the program gives:

    Traceback (most recent call last):
      File "", line 5, in ?
    TypeError: sequence expected, generator found

Replacing TypeError with Exception gives what I would have expected: a traceback starting from the raise statement. I'm not relying on one behavior or the other, but I had a TypeError in a generator, and the funny exception slowed me down in finding it.

p
--
http://mail.python.org/mailman/listinfo/python-list
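(A follow-up note, not part of the original post: the odd message apparently came from str.join converting its argument to a sequence internally and substituting its own generic TypeError message along the way. Later Python versions propagate the original exception intact, which a small probe can confirm:)

```python
def generate():
    raise TypeError('blah')
    yield ""

def join_error_message():
    # On modern Pythons the original TypeError escapes str.join unchanged,
    # so the message we catch is the one raised inside the generator.
    try:
        return "\n".join(generate())
    except TypeError as e:
        return str(e)
```

Here `join_error_message()` returns the original 'blah' message rather than a complaint about the argument type.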