Re: [Python-Dev] PEP 340 -- loose ends
Phillip J. Eby wrote:
> Specifically, I propose that PEP 340 *not* allow the use of "normal"
> iterators. Instead, the __next__ and __exit__ methods would be an
> unrelated protocol. This would eliminate the need for a 'next()'
> builtin, and avoid any confusion between today's iterators and a
> template function for use with blocks.

I would extend this to say that invoking the blocktemplate decorator
should remove the conventional iteration interface, preventing the
following problematically silent bug:

    for l in synchronized(mylock):
        # This lock is not released promptly!
        break

> My argument is that this is both Explicit (i.e., better than implicit)
> and One Obvious Way (because using existing iterators is just Another
> Way to do a "for" loop). It also doesn't allow Errors (using an
> iterator with no special semantics) to Pass Silently.

While I agree these are advantages, the bigger issue for me is the one
above: keeping a block template which expects prompt finalisation from
being inadvertently used in a conventional for loop, which won't
finalise on early termination of the loop.

I'd also suggest that the blocktemplate decorator accept any iterator,
not just generators.

> Of course, since Practicality Beats Purity, I could give this all up.
> But I don't think the Implementation is Hard to Explain, as it should
> be just as easy as Guido's proposal.

I think it would be marginally easier to explain, since the confusion
between iterators and block templates would be less of a distraction.

> Really, the only thing that changes is that you get a TypeError when a
> template function returns an iterator instead of a block template, and
> you have to use the decorator on your generators to explicitly label
> them safe for use with blocks.

I'd add raising a TypeError when a block template is passed to the
iter() builtin to the list of differences from the current incarnation
of the PEP.
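The idea can be sketched in present-day Python. Note that `blocktemplate`, `BlockTemplate`, and the `__next__`/`__exit__` protocol below are hypothetical names taken from the proposal (PEP 340 was never adopted in this form), not an API that ever shipped:

```python
import threading

class BlockTemplate:
    """Expose only the proposed block protocol, not normal iteration."""

    def __init__(self, gen):
        self._gen = gen

    # The (hypothetical) block protocol from the proposal.
    def __next__(self):
        return next(self._gen)

    def __exit__(self, exc_type=None, exc=None, tb=None):
        self._gen.close()

    # Deliberately NOT iterable: "for x in template" must fail loudly.
    def __iter__(self):
        raise TypeError("block templates cannot be used in for loops")

def blocktemplate(genfunc):
    def wrapper(*args, **kwds):
        return BlockTemplate(genfunc(*args, **kwds))
    return wrapper

@blocktemplate
def synchronized(lock):
    lock.acquire()
    try:
        yield
    finally:
        lock.release()

mylock = threading.Lock()
try:
    for l in synchronized(mylock):   # the silent bug, now loud
        break
except TypeError:
    pass
assert not mylock.locked()           # the lock was never even acquired
```

With this arrangement the misuse Nick describes raises immediately, instead of leaving the lock's release at the mercy of garbage collection.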
As for Phillip's proposal, I think using different APIs is a good way
to more clearly emphasise the difference in purpose between
conventional for loops and the new block statement, but I'm also a
little concerned about incorrectly passing a block template to a for
loop.

Cheers,
Nick.

--
Nick Coghlan | [EMAIL PROTECTED] | Brisbane, Australia
---
http://boredomandlaziness.skystorm.net

___
Python-Dev mailing list
[email protected]
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe:
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 340 -- loose ends
Guido van Rossum wrote:
> 1. I still can't decide on keyword vs. no keyword, but if we're going
> to have a keyword, I haven't seen a better proposal than block. So
> it's either block or nothing. I'll sleep on this. Feel free to start
> an all-out flame war on this in c.l.py. ;-)

I quite like 'block', but can live with no keyword (since it then
becomes a practical equivalent to user-defined statements).

> 2. No else clause; the use case is really weak and there are too many
> possible semantics. It's not clear whether to generalize from
> for/else, or if/else, or what else.

Agreed. The order I posted my list of semantic options was the order I
thought of them, but I ended up agreeing with the votes Aahz posted.

> 3. I'm leaning against Phillip's proposal; IMO it adds more complexity
> for very little benefit.

See my response to Phillip. I think there could be an advantage to it
if it means that "for l in synchronized(lock)" raises an immediate
error instead of silently doing the wrong thing.

Cheers,
Nick.

--
Nick Coghlan | [EMAIL PROTECTED] | Brisbane, Australia
---
http://boredomandlaziness.skystorm.net
Re: [Python-Dev] PEP 340 -- loose ends
Nick Coghlan a écrit :
>> 3. I'm leaning against Phillip's proposal; IMO it adds more
>> complexity for very little benefit.
>
> See my response to Phillip. I think there could be an advantage to it
> if it means that "for l in synchronized(lock)" raises an immediate
> error instead of silently doing the wrong thing.

First, I really think this PEP is needed for Python. But this expresses
exactly my main concern about it! As far as I understand it,
iterators-for-blocks and iterators-for-loops are two different beasts.
Even if an iterator-for-loops can be used within a block without
damage, the use of an iterator-for-blocks in a loop can lead to
completely unpredictable results (and results really hard to track
down, since they'll possibly involve race conditions or deadlocks).

To try to be as clear as possible, I would say that iterators-for-loops
are simplified iterators-for-blocks. IOW, if I were to put them in a
class inheritance hierarchy (I don't say we should put them into one
;) ), iterator-for-blocks would be the base class of
iterator-for-loops. Thus, as for loops require an iterator-for-loops,
they would raise an error if used with an iterator-for-blocks. But as
blocks require an iterator-for-blocks, they will allow
iterators-for-loops too!

Cheers,
Pierre

--
Pierre Barbier de Reuille

INRA - UMR Cirad/Inra/Cnrs/Univ.MontpellierII AMAP
Botanique et Bio-informatique de l'Architecture des Plantes
TA40/PSII, Boulevard de la Lironde
34398 MONTPELLIER CEDEX 5, France

tel : (33) 4 67 61 65 77    fax : (33) 4 67 61 56 68
[Python-Dev] Need to hook Py_FatalError
Greetings,

Currently Py_FatalError only dumps the error to stderr and calls
abort(). When doing quirky things with the interpreter, it's very
annoying that the process just terminates. Is there any reason why we
still don't have a simple callback to hook Py_FatalError?

PS. If the answer is "because no one needs/implemented it...", I can
volunteer.

Best regards.
Re: [Python-Dev] PEP 340 -- loose ends
Paul Svensson wrote:
> On Tue, 3 May 2005, Nick Coghlan wrote:
>
>> I'd also suggest that the blocktemplate decorator accept any
>> iterator, not just generators.
>
> So you want decorators on classes now ?

A decorator is just a function - it doesn't *need* to be used with
decorator syntax. I just think the following code should work for any
iterator:

    block blocktemplate(itr):
        # Do stuff

Cheers,
Nick.

--
Nick Coghlan | [EMAIL PROTECTED] | Brisbane, Australia
---
http://boredomandlaziness.skystorm.net
Re: [Python-Dev] PEP 340 -- loose ends
Pierre Barbier de Reuille wrote:
> Even if iterator-for-loops can be used within a block without damage,
> the use of iterator-for-block in a loop can lead to completely
> unpredictable result (and result really hard to find since they'll
> possibly involve race conditions or dead locks).

I had a longish post written before I realised I'd completely
misunderstood your comment. You were actually agreeing with me, so most
of my post was totally beside the point.

Anyway, to summarise the argument in favour of separate APIs for
iterators and block templates: the first code example below is a
harmless quirk (albeit an irritating violation of TOOWTDI). The second
and third examples are potentially serious bugs:

    block range(10) as i:
        # Just a silly way to write "for i in range(10)"

    for f in opening(name):
        # When f gets closed is Python implementation dependent

    for lock in synchronized(mylock):
        # When lock gets released is Python implementation dependent

Cheers,
Nick.

P.S. Dear lord, synchronized is an aggravating name for that function.
I keep wanting to spell it with a second letter 's', like any civilised
person ;)

--
Nick Coghlan | [EMAIL PROTECTED] | Brisbane, Australia
---
http://boredomandlaziness.skystorm.net
Re: [Python-Dev] PEP 340 -- loose ends
I've been away for a while and just read through the PEP 340 discussion
with growing amazement.
Pierre Barbier de Reuille wrote:
> As far as I understand it,
> iterator-for-blocks and iterator-for-loops are two different beasts.
Right!
> To try being as clear as possible, I would say the iterator-for-loops
> are simplified iterator-for-blocks. IOW, if I were to put them in a
> class inheritance hierarchy (I don't say we should put them into one ;)
> ) iterator-for-block would be the base class of iterator-for-loop.
> Thus,
> as for-loops require an iterator-for-loop, they would raise an error if
> used with an iterator-for-block. But as blocks require an
> iterator-for-blocks they will allow iterator-for-loops too !
IMHO it is more like round holes and square pegs (or the other way
around).
What PEP 340 seems to be trying to achieve is a generic mechanism to
define templates with holes/place holders for blocks of code. That
gives two nouns ('template' and 'code block') that both qualify as
indicators of reusable items.
We can use standard functions as reusable code blocks. Wouldn't a
template then be just a function that takes other functions as
arguments? All information transfer between the template and its
arguments is via the parameter list/returned values.
What am I missing?
--eric
Re: [Python-Dev] PEP 340 -- loose ends
Nick Coghlan <[EMAIL PROTECTED]> writes:
> Paul Svensson wrote:
>> On Tue, 3 May 2005, Nick Coghlan wrote:
>>
>>> I'd also suggest that the blocktemplate decorator accept any
>>> iterator, not just generators.
>>
>> So you want decorators on classes now ?
>
> A decorator is just a function - it doesn't *need* to be used with
> decorator syntax. I just think the following code should work for any
> iterator:
>
>     block blocktemplate(itr):
>         # Do stuff

But in

    @blocktemplate
    def foo(...):
        ...

blocktemplate isn't passed an iterator, it's passed a callable that
returns an iterator.

Cheers,
mwh

--
. <- the point    your article -> .
|- a long way |
-- Christophe Rhodes, ucam.chat
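mwh's distinction can be made concrete. In this sketch (both `blocktemplate` and the `blockiter` companion are hypothetical names for the sake of the example), the decorator form receives a function, so accepting a bare iterator would need a separate adapter:

```python
def blocktemplate(func):
    # Used as a decorator, this receives the generator *function* --
    # a callable that returns a fresh iterator per call -- never an
    # iterator itself.
    assert callable(func) and not hasattr(func, "__next__")
    def wrapper(*args, **kwds):
        return func(*args, **kwds)
    return wrapper

def blockiter(itr):
    # Hypothetical companion covering Nick's "any iterator" case:
    # adapts an already-constructed iterator directly, no call involved.
    return iter(itr)

@blocktemplate
def foo():
    yield "template body"

assert next(foo()) == "template body"
assert next(blockiter(iter(["x"]))) == "x"
```

So Nick's "accept any iterator" suggestion and the decorator usage are two different entry points, not one.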
[Python-Dev] PEP 340: Breaking out.
I have a question/suggestion about PEP 340.

As I read the PEP right now, the code:

    while True:
        block synchronized(v1):
            if v1.field:
                break
        time.sleep(1)

will never break out of the enclosing while loop. This is because the
break breaks the while loop that the block statement is translated
into, instead of breaking the outer while loop.

Am I understanding this right, or am I misunderstanding this?

If I am understanding this right, I would suggest allowing some way of
having the iterator call continue or break in the enclosing context.
(Perhaps by enclosing the entire translation of block in a try-except
construct, which catches Stop and Continue exceptions raised by the
generator and re-raises them in the outer context.)

I hope this helps.

--
Tom Rothamel --- http://www.rothamel.us/~tom/
Re: [Python-Dev] PEP 340: Breaking out.
Tom Rothamel a écrit :
> I have a question/suggestion about PEP 340.
>
> As I read the PEP right now, the code:
>
>     while True:
>         block synchronized(v1):
>             if v1.field:
>                 break
>         time.sleep(1)
>
> Will never break out of the enclosing while loop. This is because the
> break breaks the while loop that the block statement is translated
> into, instead of breaking the outer while loop.

Well, that's exactly what it is intended to do and what I would expect
it to do! break/continue affect only the inner-most loop.

> Am I understanding this right, or am I misunderstanding this?
>
> If I am understanding this right, I would suggest allowing some way of
> having the iterator call continue or break in the enclosing context.
> (Perhaps by enclosing the entire translation of block in a try-except
> construct, which catches Stop and Continue exceptions raised by the
> generator and re-raises them in the outer context.)
>
> I hope this helps.

I don't want it like that! This would differ from the break/continue
used in other loops. If you need to break out of many loops, enclose
them in a function and return from it!

Pierre

--
Pierre Barbier de Reuille
Re: [Python-Dev] PEP 340: Breaking out.
> "Pierre" == Pierre Barbier de Reuille <[EMAIL PROTECTED]> writes:

Pierre> Tom Rothamel a écrit :
>> I have a question/suggestion about PEP 340.
>>
>> As I read the PEP right now, the code:
>>
>>     while True:
>>         block synchronized(v1):
>>             if v1.field:
>>                 break
>>         time.sleep(1)
>>
>> Will never break out of the enclosing while loop.

Pierre> Well, that's exactly what it is intended to do and what I would
Pierre> expect it to do ! break/continue affect only the inner-most
Pierre> loop.

Yeah, but "block synchronized(v1)" doesn't look like a loop. I think
this might be a common stumbling block for people using this construct.

Skip
Re: [Python-Dev] PEP 340 -- concept clarification
I just made a first reading of the PEP and want to clarify my
understanding of how it fits with existing concepts.

Is it correct to say that "continue" parallels its current meaning and
returns control upwards (?outwards) to the block iterator that called
it?

Likewise, is it correct that "yield" is anti-parallel to the current
meaning? Inside a generator, it returns control upwards to the caller.
But inside a block-iterator, it pushes control downwards (?inwards) to
the block it controls.

Is the distinction between block iterators and generators similar to
the Gang-of-Four's distinction between external and internal iterators?

Are there some good use cases that do not involve resource locking?
IIRC, that same use case was listed as a prime motivating example for
decorators (i.e. @synchronized). TOOWTDI suggests that a single use
case should not be used to justify multiple, orthogonal control
structures.

It would be great if we could point to some code in the standard
library or in a major Python application that would be better (cleaner,
faster, or clearer) if re-written using blocks and block-iterators.
I've scanned through the code base looking for some places to apply the
idea and have come up empty handed. This could mean that I've not yet
grasped the essence of what makes the idea useful, or it may have other
implications, such as apps needing to be designed from the ground up
with block iterators in mind.

Raymond
Re: [Python-Dev] PEP 340: Breaking out.
Skip Montanaro a écrit :
> [...]
>
> Yeah, but "block synchronized(v1)" doesn't look like a loop. I think
> this might be a common stumbling block for people using this
> construct.
>
> Skip

Well, this can be a problem, because the block-statement does indeed
introduce a new loop construct into Python. That's why I advocated some
time ago against the introduction of a new name. IMHO, the for-loop
syntax could really be used instead of blocks, as its behavior is
exactly that of a for-loop when the iterator is an iterator-for-loops,
and the current for-loop cannot be used with iterators-for-blocks. The
main problem with this syntax is the use of blocks for things that are
not loops (like the synchronized object)! And they are, indeed, quite
common! (or they will be :) )

Pierre

--
Pierre Barbier de Reuille
Re: [Python-Dev] PEP 340 -- concept clarification
Hi,

Sounds like a useful requirement to have for new features in 2.x, IMO -
that is, "demonstrated need". If the feature implies that the app needs
to be designed from the ground up to *really* take advantage of the
feature, then maybe leave it for Guido's sabbatical (e.g. Python 3000).

On 5/3/05, Raymond Hettinger <[EMAIL PROTECTED]> wrote:
> It would be great if we could point to some code in the standard
> library or in a major Python application that would be better
> (cleaner, faster, or clearer) if re-written using blocks and
> block-iterators. I've scanned through the code base looking for some
> places to apply the idea and have come up empty handed. This could
> mean that I've not yet grasped the essence of what makes the idea
> useful or it may have other implications such as apps needing to be
> designed from the ground-up with block iterators in mind.
>
> Raymond

--
LD Landis - N0YRQ - from the St Paul side of Minneapolis
Re: [Python-Dev] Need to hook Py_FatalError
> Currently Py_FatalError only dumps the error to stderr and calls
> abort(). When doing quirky things with the interpreter, it's so
> annoying that process just terminates. Are there any reason why we
> still dont have a simple callback to hook Py_FatalError.
>
> PS. If the answer is "because no one needs/implemented...", I can
> volunteer.

Your efforts would be better directed towards fixing the causes of the
fatal errors.

I see no need to hook Py_FatalError, but since it's open source, you
are of course free to patch your own copy if your urge is truly
irresistible. Or I guess you could run Python under supervision of gdb
and trap it that way.

But tell me, what do you want the process to do instead of terminating?
Py_FatalError is used in situations where raising an exception is
impossible or would do more harm than good.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] Need to hook Py_FatalError
"m.u.k" <[EMAIL PROTECTED]> wrote:
> Currently Py_FatalError only dumps the error to stderr and calls
> abort(). When doing quirky things with the interpreter, it's so
> annoying that process just terminates. Are there any reason why we
> still dont have a simple callback to hook Py_FatalError.
>
> PS. If the answer is "because no one needs/implemented...", I can
> volunteer.

In looking at the use of Py_FatalError in the Python sources (it's a 10
meg tarball that is well worth the download), it looks as though its
use signals a fatal error (hence the name). Things like "Inconsistent
interned string state" or "Immortal interned string died" or "Can't
initialize type", etc.

Essentially, those errors generally signify "the internal state of
python is messed up", whether that be by C extension, or even a bug in
Python. The crucial observation is that many of them have ambiguous
possible recoveries. How do you come back from "Can't initialize type",
or even 'gc couldn't allocate "__del__"'?

When you have individual solutions to some subset of the uses of
Py_FatalError, then it would make sense to offer those solutions as a
replacement for Py_FatalError use in those situations (also showing
that the errors are not actually fatal), rather than to ask for a hook
to hook all (by definition) fatal errors.

- Josiah
Re: [Python-Dev] PEP 340 -- concept clarification
[Raymond Hettinger]
> I just made a first reading of the PEP and want to clarify my
> understanding of how it fits with existing concepts.

Thanks! Now is about the right time -- all the loose ends are being
solidified (in my mind anyway).

> Is it correct to say that "continue" parallels its current meaning and
> returns control upwards (?outwards) to the block iterator that called
> it?

I have a hard time using directions as metaphors (maybe because on some
hardware, stacks grow down) unless you mean "up in the source code",
which doesn't make a lot of sense either in this context. But yes,
continue does what you expect it to do in a loop. Of course, in a
resource allocation block, continue and break are pretty much the same
(just as they are in any loop that you know has only one iteration).

> Likewise, is it correct that "yield" is anti-parallel to the current
> meaning? Inside a generator, it returns control upwards to the caller.
> But inside a block-iterator, it pushes control downwards (?inwards) to
> the block it controls.

I have a hard time visualizing the difference. They feel the same to
me, and the implementation (from the generator's POV) is identical:
yield suspends the current frame, returning to the previous frame from
the call to next() or __next__(), and the suspended frame can be
resumed by calling next() / __next__() again.

> Is the distinction between block iterators and generators similar to
> the Gang-of-Four's distinction between external and internal
> iterators?

I looked it up in the book (p. 260), and I think generators have a
duality to them that makes the distinction useless, or at least
relative to your POV. With a classic for-loop driven by a generator,
the author of the for-loop thinks of it as an external iterator -- you
ask for the next item using the (implicit) call to next(). But the
author of the generator thinks of it as an internal iterator -- the for
loop resumes only when the generator feels like it.
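The suspend/resume mechanics Guido describes can be seen directly with a tiny generator (a minimal demonstration of mine, not code from the thread): each next() runs the frame up to the following yield, then suspends it.

```python
trace = []

def template():
    trace.append("running up to yield")
    yield "suspended"
    trace.append("resumed after yield")

g = template()
assert next(g) == "suspended"      # frame ran to the yield, then suspended
assert trace == ["running up to yield"]
try:
    next(g)                        # resumes the suspended frame
except StopIteration:
    pass                           # frame ran to completion
assert trace == ["running up to yield", "resumed after yield"]
```

From the generator's point of view this is identical whether the caller is a for loop or a (proposed) block statement, which is Guido's point about the duality.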
> Are there some good use cases that do not involve resource locking?
> IIRC, that same use case was listed as a prime motivating example for
> decorators (i.e. @synchronized). TOOWTDI suggests that a single use
> case should not be used to justify multiple, orthogonal control
> structures.

Decorators don't need @synchronized as a motivating use case; there are
plenty of other use cases. Anyway, @synchronized was mostly a
demonstration toy; whole method calls are rarely the right granularity
of locking. (BTW in the latest version of PEP 340 I've renamed
synchronized to locking; many people complained about the strange
Javaesque term.) Look at the examples in the PEP (version 1.16) for
more use cases.

> It would be great if we could point to some code in the standard
> library or in a major Python application that would be better
> (cleaner, faster, or clearer) if re-written using blocks and
> block-iterators. I've scanned through the code base looking for some
> places to apply the idea and have come up empty handed. This could
> mean that I've not yet grasped the essence of what makes the idea
> useful or it may have other implications such as apps needing to be
> designed from the ground-up with block iterators in mind.

I presume you mentally discarded the resource allocation use cases
where the try/finally statement was the outermost statement in the
function body, since those would be helped by @synchronized; but look
more closely at Queue, and you'll find that the two such methods use
different locks! Also the use case for closing a file upon leaving a
block, while clearly a resource allocation use case, doesn't work well
with a decorator. I just came across another use case that is fairly
common in the standard library: redirecting sys.stdout.
This is just a beauty (in fact I'll add it to the PEP):

    def saving_stdout(f):
        save_stdout = sys.stdout
        try:
            sys.stdout = f
            yield
        finally:
            sys.stdout = save_stdout

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
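Since the proposed block statement never existed, the template above can be driven by hand to show what "block saving_stdout(f):" would have done; this sketch uses modern spellings (io.StringIO, next(), generator.close()) that postdate the thread:

```python
import io
import sys

def saving_stdout(f):
    save_stdout = sys.stdout
    try:
        sys.stdout = f
        yield
    finally:
        sys.stdout = save_stdout

# Hand-driven stand-in for the proposed "block saving_stdout(buf):".
buf = io.StringIO()
gen = saving_stdout(buf)
next(gen)                 # run to the yield: stdout is now redirected
print("captured")         # lands in buf, not the real stdout
gen.close()               # run the finally clause: stdout restored
assert buf.getvalue() == "captured\n"
assert sys.stdout is not buf
```

The close() call is the crucial step: it is what the block statement would guarantee, and what a plain for loop with an early break does not.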
Re: [Python-Dev] PEP 340: Breaking out.
[Skip Montanaro]
> Yeah, but "block synchronized(v1)" doesn't look like a loop. I think
> this might be a common stumbling block for people using this
> construct.

How many try/finally statements have you written inside a loop? In my
experience this is extremely rare. I found no occurrences in the
standard library.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] PEP 340 -- concept clarification
At 09:53 AM 5/3/05 -0700, Guido van Rossum wrote:
>I just came across another use case that is fairly common in the
>standard library: redirecting sys.stdout. This is just a beauty (in
>fact I'll add it to the PEP):
>
>def saving_stdout(f):
Very nice; may I suggest 'redirecting_stdout' as the name instead?
This and other examples from the PEP still have a certain awkwardness of
phrasing in their names. A lot of them seem to cry out for a "with"
prefix, although maybe that's part of the heritage of PEP 310. But Lisp
has functions like 'with-open-file', so I don't think that it's *all* a PEP
310 influence on the examples.
It also seems to me that it would be nice if locks, files, sockets and
similar resources would implement the block-template protocol; then one
could simply say:
block self.__lock:
...
or:
open("foo") as f:
...
And not need any special wrappers. Of course, this could only work for
files if the block-template protocol were distinct from the normal
iteration protocol.
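Phillip's suggestion - give the resource itself the block protocol - is, with hindsight, essentially what PEP 343 later adopted as the context manager protocol, where locks and files implement __enter__/__exit__ directly. A sketch in today's Python, offered as a historical footnote rather than anything available in 2005:

```python
import os
import threading

lock = threading.Lock()
with lock:                 # modern spelling of "block self.__lock:"
    assert lock.locked()   # acquired on entry, no wrapper generator needed
assert not lock.locked()   # released on exit, even if the body raises

with open(os.devnull) as f:    # modern spelling of 'open("foo") as f:'
    data = f.read()
assert f.closed                # closed as soon as the block is left
assert data == ""
```

And, as Phillip anticipated, this protocol is indeed distinct from the normal iteration protocol, so no confusion with for loops arises.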
Re: [Python-Dev] PEP 340 -- concept clarification
On Tue, May 03, 2005, Phillip J. Eby wrote:
> At 09:53 AM 5/3/05 -0700, Guido van Rossum wrote:
>>
>> I just came across another use case that is fairly common in the
>> standard library: redirecting sys.stdout. This is just a beauty (in
>> fact I'll add it to the PEP):
>>
>> def saving_stdout(f):
>
> Very nice; may I suggest 'redirecting_stdout' as the name instead?

You may; I'd nitpick that to either "redirect_stdout" or
"redirected_stdout". "redirecting_stdout" is slightly longer and
doesn't have quite the right flavor to my eye. I might even go for
"make_stdout" or "using_stdout"; that relies on people understanding
that a block means temporary usage.

> This and other examples from the PEP still have a certain awkwardness
> of phrasing in their names. A lot of them seem to cry out for a "with"
> prefix, although maybe that's part of the heritage of PEP 310. But
> Lisp has functions like 'with-open-file', so I don't think that it's
> *all* a PEP 310 influence on the examples.

Yes, that's why I've been pushing for "with".
--
Aahz ([EMAIL PROTECTED]) <*> http://www.pythoncraft.com/

"It's 106 miles to Chicago. We have a full tank of gas, a half-pack of
cigarettes, it's dark, and we're wearing sunglasses." "Hit it."
Re: [Python-Dev] PEP 340 -- concept clarification
On May 3, 2005, at 12:53 PM, Guido van Rossum wrote:
> def saving_stdout(f):
>     save_stdout = sys.stdout
>     try:
>         sys.stdout = f
>         yield
>     finally:
>         sys.stdout = save_stdout

I hope you aren't going to be using that in any threaded program.
That's one really nice thing about lisp's dynamic variables: they
automatically interact properly with threads.

    (defvar *foo* nil)

    (let ((*foo* 5))
      ;; *foo* has a value of 5 for all functions called from here,
      ;; but only in this thread. In other threads it'll still be nil.
      )
    ;; *foo* has gone back to nil.

James
Re: [Python-Dev] Need to hook Py_FatalError
Hi,

Guido van Rossum <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:
> Your efforts would be better directed towards fixing the causes of the
> fatal errors.
>
> I see no need to hook Py_FatalError, but since it's open source, you
> are of course free to patch your own copy if your urge is truly
> irresistible. Or I guess you could run Python under supervision of gdb
> and trap it that way.

Well, I admit it is a bit trivial (as is its implementation); at least
nobody has wanted it within Python's 10+ year lifetime. Indeed I'm
using my own patched copy; I just thought it'd be good if some other
naughty boy playing dangerous games with interpreter internals didn't
spend hours in a debugger trying to reproduce the crash.

> But tell me, what do you want the process to do instead of
> terminating? Py_FatalError is used in situations where raising an
> exception is impossible or would do more harm than good.

The need for this is only logging purposes, e.g. the process just
terminates on a client machine, and you have no logs, no clues (except
a coredump) - a nightmare! Some sort of log would be invaluable here.

Best regards.
Re: [Python-Dev] Need to hook Py_FatalError
On Tue, May 03, 2005 at 09:15:42AM -0700, Guido van Rossum wrote:
> But tell me, what do you want the process to do instead of
> terminating? Py_FatalError is used in situations where raising an
> exception is impossible or would do more harm than good.

In an application which embeds Python, I want to show the application's
standard error dialog, which doesn't call any Python APIs (but does do
things like capture the call stack at the time of the error). For this
use, it doesn't matter that no further calls to those APIs are
possible.

Jeff
Re: [Python-Dev] Need to hook Py_FatalError
Hi,

Josiah Carlson <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:
> In looking at the use of Py_FatalError in the Python Sources (it's a
> 10 meg tarball that is well worth the download), it looks as though
> its use shows a Fatal error (hence the name). Things like
> "Inconsistant interned string state" or "Immortal interned string
> died" or "Can't initialize type", etc.
>
> Essentially, those errors generally signify "the internal state of
> python is messed up", whether that be by C extension, or even a bug in
> Python. The crucial observation is that many of them have ambiguous
> possible recoveries. How do you come back from "Can't initialize
> type", or even 'gc couldn't allocate "__del__"'?

The hook is not meant to come back from the error - it is just for
logging; see my previous post, please.

Best regards.
Re: [Python-Dev] PEP 340: Breaking out.
> "Guido" == Guido van Rossum <[EMAIL PROTECTED]> writes: Guido> [Skip Montanaro] >> Yeah, but "block synchronized(v1)" doesn't look like a loop. I think >> this might be a common stumbling block for people using this >> construct. Guido> How many try/finally statements have you written inside a loop? Guido> In my experience this is extremely rare. I found no Guido> occurrences in the standard library. How'd we start talking about try/finally? To the casual observer, this looks like "break" should break out of the loop:

while True:
    block synchronized(v1):
        ...
        if v1.field:
            break
    time.sleep(1)

The PEP says:

    Note that it is left in the middle whether a block-statement
    represents a loop or not; this is up to the iterator, but in the
    most common case BLOCK1 is executed exactly once.

That suggests to me it's still not clear if the block statement is actually a looping statement. If not, then "break" should almost certainly break out of the while loop. BTW, what did you mean by "left in the middle"? I interpreted it as "still undecided", but it's an idiom I've never seen. Perhaps it should be replaced by something clearer. Skip
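For what it's worth, Skip's example can be spelled out with the try/finally that "block synchronized(v1)" would abbreviate; in that spelling the behaviour of "break" is unambiguous. A runnable sketch (the lock, the v1 object, and the loop count are stand-ins invented here for illustration):

```python
import threading
import time

v1_lock = threading.Lock()

class V1:
    field = False

v1 = V1()
iterations = 0

while True:
    v1_lock.acquire()
    try:
        iterations += 1
        if iterations >= 3:
            v1.field = True   # simulate the condition becoming true
        if v1.field:
            break             # exits the *while* loop; the finally still runs
    finally:
        v1_lock.release()     # released on every path, including break
    time.sleep(0.01)
```

With try/finally there is no ambiguity: break belongs to the enclosing while loop, and the lock is still released on the way out.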
Re: [Python-Dev] PEP 340 -- concept clarification
[Raymond] > > Likewise, is it correct that "yield" is anti-parallel to the current > > meaning? Inside a generator, it returns control upwards to the caller. > > But inside a block-iterator, it pushes control downwards (?inwards) to > > the block it controls. [Guido van Rossum] > I have a hard time visualizing the difference. They feel the same to > me, and the implementation (from the generator's POV) is identical: > yield suspends the current frame, returning to the previous frame from > the call to next() or __next__(), and the suspended frame can be > resumed by calling next() / __next__() again. This concept ought to be highlighted in the PEP because it explains clearly what "yield" does and it may help the transition from a non-Dutch mental model. I expect that many folks (me included) think in terms of caller vs callee with a parallel spatial concept of enclosing vs enclosed. In that model, the keywords "continue", "break", "yield", and "return" all imply a control transfer from the enclosed back to the encloser. In contrast, the new use of yield differs in that the suspended frame transfers control from the encloser to the enclosed. > > Are there some good use cases that do not involve resource locking? > > IIRC, that same use case was listed as a prime motivating example for > > decorators (i.e. @synchronized). TOOWTDI suggests that a single use case > > should not be used to justify multiple, orthogonal control structures. > > Decorators don't need @synchronized as a motivating use case; there > are plenty of other use cases. No doubt about that. > Anyway, @synchronized was mostly a demonstration toy; whole method > calls are rarely the right granularity of locking. Agreed. Since that is the case, there should be some effort to shift some of the examples towards real use cases where a block-iterator is the appropriate solution.
It need not hold up releasing the PEP to comp.lang.python, but it would go a long way towards improving the quality of the subsequent discussion. > (BTW in the latest > version of PEP 340 I've renamed synchronized to locking; many people > complained about the strange Javaesque term.) That was diplomatic. Personally, I find it amusing when there is an early focus on naming rather than on functionality, implementation issues, use cases, usability, and goodness-of-fit within the language. > > It would be great if we could point to some code in the standard library > or in a major Python application that would be better (cleaner, faster, > or clearer) if re-written using blocks and block-iterators > look > more closely at Queue, and you'll find that the two such methods use > different locks! I don't follow this one. Tim's uses of not_empty and not_full are orthogonal (pertaining to pending gets at one end of the queue and to pending puts at the other end). The other use of the mutex is independent of either pending puts or gets; instead, it is a weak attempt to minimize what can happen to the queue during a size query. While the try/finallys could get factored-out into separate blocks, I do not see how the code could be considered better off. There is a slight worsening of all metrics of merit: line counts, total number of function defs, number of calls, or number of steps executed outside the lock (important given that the value of a query result declines rapidly once the lock is released). > Also the use case for closing a file upon leaving a block, while > clearly a resource allocation use case, doesn't work well with a > decorator. Right. > I just came across another use case that is fairly common in the > standard library: redirecting sys.stdout. This is just a beauty (in > fact I'll add it to the PEP):
>
> def saving_stdout(f):
>     save_stdout = sys.stdout
>     try:
>         sys.stdout = f
>         yield
>     finally:
>         sys.stdout = save_stdout

This is the strongest example so far.
When adding it to the PEP, it would be useful to contrast the code with simpler alternatives like PEP 288's g.throw() or PEP 325's g.close(). On the plus side, the block-iterator approach factors out code common to multiple callers. On the minus side, the other PEPs involve simpler mechanisms and their learning curve would be nearly zero. These pluses and minuses are important because they apply equally to all examples using blocks for initialization/finalization. Raymond
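In today's Python the saving_stdout template under discussion can be exercised without any new syntax by driving the generator by hand: next() runs it up to the yield, and close() (the finalization mechanism in the spirit of PEP 325) triggers the finally clause. A sketch of the idea, not the PEP's actual block expansion:

```python
import sys
from io import StringIO

def saving_stdout(f):
    # the template from the discussion: redirect stdout, restore on exit
    save_stdout = sys.stdout
    try:
        sys.stdout = f
        yield
    finally:
        sys.stdout = save_stdout

buf = StringIO()
gen = saving_stdout(buf)
next(gen)         # enter the template: stdout is now redirected to buf
print("captured")
gen.close()       # finalize: the finally clause restores the old stdout
```

The block-statement would make the next()/close() bookkeeping implicit; this sketch just shows that the generator body already expresses the whole setup/teardown pattern.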
Re: [Python-Dev] PEP 340: Breaking out.
[Skip Montanaro] > >> Yeah, but "block synchronized(v1)" doesn't look like a loop. I think > >> this might be a common stumbling block for people using this > >> construct. > > Guido> How many try/finally statements have you written inside a loop? > Guido> In my experience this is extremely rare. I found no > Guido> occurrences in the standard library. [Skip again] > How'd we start talking about try/finally? Because it provides by far the dominant use case for 'block'. The block-statement is intended to replace many boilerplate uses of try/finally. In addition, it's also a coroutine invocation primitive. > To the casual observer, this > looks like "break" should break out of the loop:
>
> while True:
>     block synchronized(v1):
>         ...
>         if v1.field:
>             break
>     time.sleep(1)

Without 'block' this would be written as try/finally. And my point is that people just don't write try/finally inside a while loop very often (I found *no* examples in the entire standard library). > The PEP says: > > Note that it is left in the middle whether a block-statement > represents a loop or not; this is up to the iterator, but in the > most common case BLOCK1 is executed exactly once. > > That suggests to me it's still not clear if the block statement is actually > a looping statement. If not, then "break" should almost certainly break out > of the while loop. Dynamically, it's most likely not a loop. But the compiler doesn't know that, so the compiler considers it a loop. > BTW, what did you mean by "left in the middle"? I interpreted it as > "still undecided", but it's an idiom I've never seen. Perhaps it should be > replaced by something clearer. It may be a Dutch phrase that doesn't translate to English as well as I thought. It doesn't exactly mean "still undecided" but more "depends on your POV". I'll use something different, and also clarify that as far as break/continue are concerned, it *is* a loop.
-- --Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] Need to hook Py_FatalError
"m.u.k" <[EMAIL PROTECTED]> wrote: > > Hi, > > Guido van Rossum <[EMAIL PROTECTED]> wrote in > news:[EMAIL PROTECTED]: > > > Your efforts would be better directed towards fixing the causes of the > > fatal errors. > > > > I see no need to hook Py_FatalError, but since it's open source, you > > are of course free to patch your own copy if your urge is truly > > irresistible. Or I guess you could run Python under supervision of gdb > > and trap it that way. > > Well, I admit it is a bit trivial (as is its implementation); at least nobody > has wanted it within Python's 10+ year lifetime. Indeed, I'm using my own patched copy; > I just thought it would be good if some other naughty boy playing dangerous games > with interpreter internals didn't have to spend hours in a debugger trying to reproduce > the crash. > > > But tell me, what do you want the process to do instead of > > terminating? Py_FatalError is used in situations where raising an > > exception is impossible or would do more harm than good. > > The need for this is for logging purposes only. E.g. the process just terminates > on a client machine: you have no logs, no clues (except a coredump). A nightmare! > Some sort of log would be invaluable here. Offering any hook for Py_FatalError may not even be enough, as some of those errors are caused by insufficient memory. What if a hook were available, but it couldn't be called because there wasn't enough memory? Of course there is the option of pre-allocating a few kilobytes, then just before one calls the hook, freeing that memory so that the hook can execute (assuming the hook is small enough). I'm not sure if this is a desirable general mechanic, but it may be sufficient for you. If you do figure out a logging mechanism that is almost guaranteed to execute on FatalError, post it to sourceforge. - Josiah
Re: [Python-Dev] PEP 340 -- concept clarification
> [Raymond] > > > Likewise, is it correct that "yield" is anti-parallel to the current > > > meaning? Inside a generator, it returns control upwards to the caller. > > > But inside a block-iterator, it pushes control downwards (?inwards) to > > > the block it controls. > > [Guido van Rossum] > > I have a hard time visualizing the difference. They feel the same to > > me, and the implementation (from the generator's POV) is identical: > > yield suspends the current frame, returning to the previous frame from > > the call to next() or __next__(), and the suspended frame can be > > resumed by calling next() / __next__() again. [Raymond] > This concept ought to be highlighted in the PEP because it explains > clearly what "yield" does and it may help the transition from a non-Dutch > mental model. I expect that many folks (me included) think in terms of > caller vs callee with a parallel spatial concept of enclosing vs > enclosed. In that model, the keywords "continue", "break", "yield", and > "return" all imply a control transfer from the enclosed back to the > encloser. I'm still confused and surprised that you think I need to explain what yield does, since the PEP doesn't change one bit about this. The encloser/enclosed parallel to caller/callee doesn't make sense to me; but that may just be because I'm Dutch. > In contrast, the new use of yield differs in that the suspended frame > transfers control from the encloser to the enclosed. Why does your notion of who encloses whom suddenly reverse when you go from a for-loop to a block-statement? This all feels very strange to me. > > Anyway, @synchronized was mostly a demonstration toy; whole method > > calls are rarely the right granularity of locking. > > Agreed. Since that is the case, there should be some effort to shift > some of the examples towards real use cases where a block-iterator is > the appropriate solution.
It need not hold up releasing the PEP to > comp.lang.python, but it would go a long way towards improving the > quality of the subsequent discussion. Um? I thought I just showed that locking *is* a good use case for the block-statement and you agreed; now why would I have to move away from it? I think I'm thoroughly confused by your critique of the PEP. Perhaps you could suggest some concrete rewritings to knock me out of my confusion? > Personally, I find it amusing when there is an > early focus on naming rather than on functionality, implementation > issues, use cases, usability, and goodness-of-fit within the language. Well, the name of a proposed concept does a lot to establish its first impression. First impressions matter! > > > It would be great if we could point to some code in the standard library > > > or in a major Python application that would be better (cleaner, faster, > > > or clearer) if re-written using blocks and block-iterators > > > look > > more closely at Queue, and you'll find that the two such methods use > > different locks! > > I don't follow this one. Tim's uses of not_empty and not_full are > orthogonal (pertaining to pending gets at one end of the queue and to > pending puts at the other end). The other use of the mutex is > independent of either pending puts or gets; instead, it is a weak > attempt to minimize what can happen to the queue during a size query. I meant to use this as an example of the unsuitability of the @synchronized decorator, since it implies that all synchronization is on the same mutex, thereby providing a use case for the locking block-statement. I suspect we're violently in agreement though. > While the try/finallys could get factored-out into separate blocks, I do > not see how the code could be considered better off.
There is a slight > worsening of all metrics of merit: line counts, total number of > function defs, number of calls, or number of steps executed outside the > lock (important given that the value a query result declines rapidly > once the lock is released). I don't see how the line count metric would lose: a single "locking()" primitive exported by the threading module would be usable by all code that currently uses try/finally to acquire and release a lock. Performance needn't suffer either, if the locking() primitive is implemented in C (it could be a straightforward translation of example 6 into C). > > I just came across another use case that is fairly common in the > > standard library: redirecting sys.stdout. This is just a beauty (in > > fact I'll add it to the PEP): > > > > def saving_stdout(f): > > save_stdout = sys.stdout > > try: > > sys.stdout = f > > yield > > finally: > > sys.stdout = save_stdout > > This is the strongest example so far. When adding it to the PEP, it > would be useful to contrast the code with simpler alternatives like PEP > 288's g.throw() or PEP 325's g.close(). On the plus side, the > block-iterator approach factors out code common to multiple callers. On > the minus side, the ot
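The locking() primitive mentioned here is easy to picture as a generator template driven by hand with next() and close(). A sketch of the idea only; no such function existed in the threading module at the time:

```python
import threading

def locking(lock):
    # generator template: everything after the yield is the cleanup
    lock.acquire()
    try:
        yield
    finally:
        lock.release()

mylock = threading.Lock()
gen = locking(mylock)
next(gen)      # "enter the block": the lock is now held
held = mylock.locked()
gen.close()    # "leave the block": the finally clause releases the lock
```

The block-statement would hide the next()/close() calls, but the template itself is exactly the try/finally boilerplate written once.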
Re: [Python-Dev] Need to hook Py_FatalError
Hi, Josiah Carlson <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]: > Offering any hook for Py_FatalError may not even be enough, as some of > those errors are caused by insufficient memory. What if a hook were > available, but it couldn't be called because there wasn't enough memory? > > Of course there is the option of pre-allocating a few kilobytes, then > just before one calls the hook, freeing that memory so that the hook can > execute (assuming the hook is small enough). I'm not sure if this is a > desireable general mechanic, but it may be sufficient for you. If you > do figure out a logging mechanism that is almost guaranteed to execute > on FatalError, post it to sourceforge. IMHO this should be left to the hooker (apparently not the right word, but you get the point :)). If it allocates more memory or does heavy stuff, it will just fail. Anyway, abort() is a failure too. Either abort() will end the process or the OS will on such a critical error. Best regards.
[Python-Dev] PEP 340 -- concept clarification
[Raymond Hettinger]
>> Likewise, is it correct that "yield" is anti-parallel to the current
>> meaning? Inside a generator, it returns control upwards to the caller.
>> But inside a block-iterator, it pushes control downwards (?inwards) to
>> the block it controls.
Guido:
> I have a hard time visualizing the difference.
In a normal generator, someone makes a call to establish the
generator, which then becomes a little island -- anyone can call
the generator, and it returns control back to whoever made the last call.
With the block, every yield returns to a single designated callback.
This callback had to be established at the same time the block was
created, and must be textually inside it. (An indented suite to the
"block XXX:" line.)
>> Are there some good use cases that do not involve resource locking?
> Decorators don't need @synchronized as a motivating use case;
> there are plenty of other use cases.
But are there plenty of other use cases for PEP 340?
If not, then why do we need PEP 340? Are decorators not strong
enough, or is it just that people aren't comfortable yet? If it is a
matter of comfort or recipes, then the new construct might have
just as much trouble. (So this one is not a loop, and you can tell
the difference because ... uh, just skip that advanced stuff.)
> Anyway, @synchronized was mostly a demonstration toy; whole
> method calls are rarely the right granularity of locking.
That is an important difference -- though I'm not sure that the critical
part *shouldn't* be broken out into a separate method.
>> I've scanned through the code base looking for some places
>> to apply the idea and have come up empty handed.
> I presume you mentally discarded the resource allocation use
> cases where the try/finally statement was the outermost statement
> in the function body, since those would be helped by @synchronized;
> but look more closely at Queue, and you'll find that the two such
> methods use different locks!
qsize, empty, and full could be done with a lockself decorator.
Effectively, they *are* lockself decorators for the _xxx functions
that subclasses are told to override.
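A "lockself" decorator of the kind alluded to here might look like the following (the name, the MiniQueue class, and the use of try/finally are illustrative assumptions, not the actual 2.4 Queue code):

```python
import threading

def lockself(method):
    # wrap a method so self.mutex is held for the duration of the call
    def wrapper(self, *args, **kwargs):
        self.mutex.acquire()
        try:
            return method(self, *args, **kwargs)
        finally:
            self.mutex.release()
    return wrapper

class MiniQueue:
    def __init__(self):
        self.mutex = threading.Lock()
        self._items = []

    @lockself
    def qsize(self):
        # plays the role of the _xxx hook that subclasses override
        return len(self._items)

q = MiniQueue()
q._items.extend([1, 2])
```

Because the lock lives on self, the decorator has to reach for self.mutex inside the wrapper; that is exactly why a plain module-level decorator can't be shared across classes with different locking conventions.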
If you're talking about put and get, decorators don't help as much,
but I'm not sure blocks are much better.
You can't replace the outermost try ... finally with a common decorator
because the locks are self variables. A block, by being inside a method,
would be delayed until self exists -- but that outer lock is only a
tiny fraction
of the boilerplate. It doesn't help with
if not block:
    if self._STATE():
        raise STATEException
elif timeout is None:
    while self._STATE():
        self.not_STATE.wait()
else:
    if timeout < 0:
        raise ValueError("'timeout' must be a positive number")
    endtime = _time() + timeout
    while self._STATE():
        remaining = endtime - _time()
        if remaining <= 0.0:
            raise STATEException
        self.not_STATE.wait(remaining)
val = self._RealMethod(item)  # OK, the put optimizes out this and the return
self.not_OTHERSTATE.notify()
return val
I wouldn't object to a helper method, but using a block just to get rid of four
lines (two of which are the literals "try:" and "finally:") seems barely worth
doing, let alone with special new syntax.
> Also the use case for closing a file upon leaving a block, while
> clearly a resource allocation use case, doesn't work well with a
> decorator.
def autoclose(fn):
    def outer(filename, *args, **kwargs):
        f = open(filename)
        val = fn(f, *args, **kwargs)
        f.close()
        return val
    return outer

@autoclose
def f1(f):
    for line in f:
        print line
> I just came across another use case that is fairly common in the
> standard library: redirecting sys.stdout. This is just a beauty (in
> fact I'll add it to the PEP):
> def saving_stdout(f):
>     save_stdout = sys.stdout
>     try:
>         sys.stdout = f
>         yield
>     finally:
>         sys.stdout = save_stdout
Why does this need a yield? Why not just a regular call to the
function? If you're trying to generalize the redirector, then this
also works as a decorator. The nested functions (and the *args,
**kwargs, if you don't inherit from a standard decorator) are a
bit of an annoyance, but I'm not sure the new iterator form will
be any easier to explain.
def saving_stdout(f):
    import sys  # Just in case...
    def captured_stream(fn):
        def redirect(*args, **kwargs):
            save_stdout = sys.stdout
            try:
                sys.stdout = f
                return fn(*args, **kwargs)
            finally:
                sys.stdout = save_stdout
        return redirect
    return captured_stream

o = StringIO()

@saving_stdout(o)
...
Re: [Python-Dev] PEP 340 -- concept clarification
[Raymond]
It would be great if we could point to some code in the standard library
or in a major Python application that would be better (cleaner, faster,
or clearer) if re-written using blocks and block-iterators
[Guido]
>>> look more closely at Queue, and you'll find that the two such methods
>>> use different locks!
[Raymond]
>> I don't follow this one. Tim's uses of not_empty and not_full are
>> orthogonal (pertaining to pending gets at one end of the queue and to
>> pending puts at the other end). The other use of the mutex is
>> independent of either pending puts or gets; instead, it is a weak
>> attempt to minimize what can happen to the queue during a size query.
[Guido]
> I meant to use this as an example of the unsuitability of the
> @synchronized decorator, since it implies that all synchronization is
> on the same mutex, thereby providing a use case for the locking
> block-statement.
Queue may be a confusing example. Older versions of Queue did indeed
use more than one mutex. The _current_ (2.4+) version of Queue uses
only one mutex, but shared across two condition variables (`not_empty`
and `not_full` are condvars in current Queue, not locks). Where,
e.g., current Queue.put() starts with
    self.not_full.acquire()
it _could_ say
    self.not_empty.acquire()
instead with the same semantics, or it could say
    self.mutex.acquire()
They all do an acquire() on the same mutex. If put() needs to wait,
it needs to wait on the not_full condvar, so it's conceptually
clearest for put() to spell it the first of these ways.
Because Queue does use condvars now instead of plain locks, I wouldn't
approve of any gimmick purporting to hide the acquire/release's in
put() or get(): that those are visible is necessary to seeing that
the _condvar_ protocol is being followed ("must acquire() before
wait(); must be acquire()'ed during notify(); no path should leave the
condvar acquire()d 'for a long time' before a wait() or release()").
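The shared-mutex arrangement described here can be reproduced directly, since threading.Condition accepts the lock it should wrap (a small sketch, not Queue's actual code):

```python
import threading

mutex = threading.Lock()
not_empty = threading.Condition(mutex)   # both condvars wrap the same lock
not_full = threading.Condition(mutex)

# Acquiring either condition variable acquires the one shared mutex,
# which is why the alternative spellings of put()'s acquire are equivalent.
not_empty.acquire()
held_via_condvar = mutex.locked()
not_empty.release()
```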
Re: [Python-Dev] PEP 340 -- concept clarification
> [Raymond Hettinger] > >> Likewise, is it correct that "yield" is anti-parallel to the current > >> meaning? Inside a generator, it returns control upwards to the caller. > >> But inside a block-iterator, it pushes control downwards (?inwards) to > >> the block it controls. > [Guido] > > I have a hard time visualizing the difference. [Jim Jewett] > In a normal generator, someone makes a call to establish the > generator, which then becomes a little island -- anyone can call > the generator, and it returns control back to whoever made the last call. > > With the block, every yield returns to a single designated callback. > This callback had to be established at the same time the block was > created, and must be textually inside it. (An indented suite to the > "block XXX:" line.) Doesn't convince me. The common use for a regular generator is in a for-loop, where every yield also returns to a single designated place (calling it a callback is really deceptive!). And with a block, you're free to put the generator call ahead of the block so you can call next() on it manually:

it = EXPR1
block it:
    BLOCK1

is totally equivalent to

block EXPR1:
    BLOCK1

but the first form lets you call next() on it as you please (until the block is exited, for sure). > But are there plenty of other use cases for PEP 340? Yes. Patterns like "do this little dance in a try/finally block" and "perform this tune when you catch an XYZ exception" are pretty common in larger systems and are effectively abstracted away using the block-statement and an appropriate iterator. The try/finally use case often also has some setup that needs to go right before the try (and sometimes some more setup that needs to go *inside* the try). Being able to write this once makes it a lot easier when the "little dance" has to be changed everywhere it is performed. > If not, then why do we need PEP 340? Are decorators not strong > enough, or is it just that people aren't comfortable yet?
If it is a > matter of comfort or recipes, then the new construct might have > just as much trouble. (So this one is not a loop, and you can tell > the difference because ... uh, just skip that advanced stuff.) PEP 340 and decorators are totally different things, and the only vaguely common use case would be @synchronized, which is *not* a proper use case for decorators, but "safe locking" is definitely a use case for PEP 340. > > Anyway, @synchronized was mostly a demonstration toy; whole > > method calls are rarely the right granularity of locking. > > That is an important difference -- though I'm not sure that the critical > part *shouldn't* be broken out into a separate method. I'll be the judge of that. I have plenty of examples where breaking it out would create an entirely artificial helper method that takes several arguments just because it needs to use stuff that its caller has set up for it. > > I presume you mentally discarded the resource allocation use > > cases where the try/finally statement was the outermost statement > > in the function body, since those would be helped by @synchronized; > > but look more closely at Queue, and you'll find that the two such > > methods use different locks! > > qsize, empty, and full could be done with a lockself decorator. > Effectively, they *are* lockself decorators for the _xxx functions > that subclasses are told to override. Actually you're pointing out a bug in the Queue module: these *should* be using a try/finally clause to ensure the mutex is released even if the inner call raises an exception. I hadn't noticed these before because I was scanning only for "finally". If a locking primitive had been available, I'm sure it would have been used here. > If you're talking about put and get, decorators don't help as much, > but I'm not sure blocks are much better. > > You can't replace the outermost try ... finally with a common decorator > because the locks are self variables.
A block, by being inside a method, > would be delayed until self exists -- but that outer lock is only a > tiny fraction of the boilerplate. It doesn't help with > [...example deleted...] > I wouldn't object to a helper method, but using a block just to get rid of > four > lines (two of which are the literals "try:" and "finally:") seems barely worth > doing, let alone with special new syntax. Well, to me it does; people have been requesting new syntax for this specific case for a long time (that's where PEP 310 is coming from). > > Also the use case for closing a file upon leaving a block, while > > clearly a resource allocation use case, doesn't work well with a > > decorator. >
> def autoclose(fn):
>     def outer(filename, *args, **kwargs):
>         f = open(filename)
>         val = fn(f, *args, **kwargs)
>         f.close()
>         return val
>     return outer
>
> @autoclose
> def f1(f):
>     for line in f:
>         print line

But the auto-closing file, even more than the self-rele
Re: [Python-Dev] PEP 340 -- concept clarification
[Tim]
> Because Queue does use condvars now instead of plain locks, I wouldn't
> approve of any gimmick purporting to hide the acquire/release's in
> put() or get(): that those are visible is necessary to seeing that
> the _condvar_ protocol is being followed ("must acquire() before
> wait(); must be acquire()'ed during notify(); no path should leave the
> condvar acquire()d 'for a long time' before a wait() or release()").
So you think that this would be obscure? A generic condition variable
use could look like this:
block locking(self.condvar):
    while not self.items:
        self.condvar.wait()
    self.process(self.items)
    self.items = []
instead of this:
self.condvar.acquire()
try:
    while not self.items:
        self.condvar.wait()
    self.process(self.items)
    self.items = []
finally:
    self.condvar.release()
I find that the "block locking" version looks just fine; it makes the
scope of the condition variable quite clear despite not having any
explicit acquire() or release() calls (there are some abstracted away
in the wait() call too!).
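The try/finally spelling of this example runs unchanged today; a small harness (the Collector class and the producer thread are invented here for illustration) shows the condvar protocol being followed end to end:

```python
import threading
import time

class Collector:
    def __init__(self):
        self.condvar = threading.Condition()
        self.items = []
        self.processed = []

    def process(self, items):
        self.processed.extend(items)

    def consume(self):
        # the try/finally version of the example, essentially verbatim
        self.condvar.acquire()
        try:
            while not self.items:
                self.condvar.wait()
            self.process(self.items)
            self.items = []
        finally:
            self.condvar.release()

c = Collector()

def producer():
    time.sleep(0.05)
    c.condvar.acquire()
    try:
        c.items.append("work")
        c.condvar.notify()   # wake the consumer; the mutex must be held here
    finally:
        c.condvar.release()

t = threading.Thread(target=producer)
t.start()
c.consume()   # blocks in wait() until the producer notifies
t.join()
```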
--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] PEP 340 -- concept clarification
... [Jim Jewett] >> qsize, empty, and full could be done with a lockself decorator. >> Effectively, they *are* lockself decorators for the _xxx functions >> that subclasses are told to override. [Guido] > Actually you're pointing out a bug in the Queue module: these *should* > be using a try/finally clause to ensure the mutex is released even if > the inner call raises an exception. Yup! OTOH, if those dead-simple methods raised an exception, the Queue has probably gone wholly insane anyway. > I hadn't noticed these before because I was scanning only for "finally" > > If a locking primitive had been available, I'm sure it would have been > used here. That too.
Re: [Python-Dev] PEP 340 -- loose ends
Hi Guido, On Mon, May 02, 2005 at 17:55 -0700, Guido van Rossum wrote: > These are the loose ends on the PEP (apart from filling in some > missing sections): > > 1. Decide on a keyword to use, if any. I just read PEP 340 basically for the first time, so bear with me. First i note that introducing a keyword 'block' would break lots of programs, among them half of PyPy. Unlike many other keywords 'block' is a pretty common variable name. For invoking blocktemplates i like the no-keyword approach, instead. However, i would find it much clearer if *defining* blocktemplates used a new keyword, like:

blocktemplate opening(filename, mode="r"):
    ...

because this immediately tells me what the purpose and semantics of the following definition is. The original overloading of 'def' to mean generators if the body contains a yield statement was already a matter of discussion (AFAIK). When i came to Python it was at 2.2 and i remember wondering about this "def" oddity. Extending poor old 'def' functions now to possibly mean block templates gives me semantic overload even if it is justified from an implementation point of view. I am talking purely about (my sense of) code readability here, not about implementation. cheers, holger
[Python-Dev] 2 words keyword for block
I'm not really in a position to speak, but since I only saw people trying to come up with a keyword using one word, and without much success, I would venture to suggest the possibility of making a keyword out of two words. Would there be a huge problem with using two words to make up a keyword? Like, for example (or if using a space is a real problem):

in template thread_safe(lock):
in template redirected_stdout(stream):
in template use_and_close_file(path) as file:
in template as_transaction():
in template auto_retry(times=3, failas=IOError):
Re: [Python-Dev] PEP 340 -- loose ends
[Holger]
> > 1. Decide on a keyword to use, if any.
>
> I just read the PEP340 basically the first time so bear with me.

Thanks for reviewing!

> First i note that introducing a keyword 'block' would break
> lots of programs, among it half of PyPy. Unlike many other
> keywords 'block' is a pretty common variable name. For
> invoking blocktemplates i like the no-keyword approach, instead.

Good point (the code from Queue.py quoted by Jim Jewett also uses block as a variable name :-). There has been much argument on both sides. I guess we may need to have a subcommittee to select the keyword (if any)... Maybe if we can't go without a keyword, 'with' would be okay after all; I'm not so strongly in favor of a Pascal/VB-style with-statement after reading the C# developers' comments (see reference in the PEP).

> However, i would find it much clearer if *defining* blocktemplates
> used a new keyword, like:
>
>     blocktemplate opening(filename, mode="r"):
>         ...
>
> because this immediately tells me what the purpose and semantics
> of the following definition is. The original overloading of 'def' to
> mean generators if the body contains a yield statement was already a
> matter of discussion (AFAIK). When i came to Python it was at 2.2
> and i remember wondering about this "def" oddity.
>
> Extending poor old 'def' functions now to possibly mean block
> templates gives me semantical overload even if it is justified
> from an implementation point of view. I am talking purely
> about (my sense of) code readability here not about implementation.

Hm... Maybe you also want to have separate function and procedure keywords? Or static typing? 'def' can be used to define all sorts of things; that is Python's beauty!

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] PEP 340 -- concept clarification
[Tim]
>> Because Queue does use condvars now instead of plain locks, I wouldn't
>> approve of any gimmick purporting to hide the acquire/release's in
>> put() or get(): that those are visible is necessary to seeing that
>> the _condvar_ protocol is being followed ("must acquire() before
>> wait(); must be acquire()'ed during notify(); no path should leave the
>> condvar acquire()d 'for a long time' before a wait() or release()").
[Guido]
> So you think that this would be obscure? A generic condition variable
> use could look like this:
>
>     block locking(self.condvar):
>         while not self.items:
>             self.condvar.wait()
>         self.process(self.items)
>         self.items = []
>
> instead of this:
>
>     self.condvar.acquire()
>     try:
>         while not self.items:
>             self.condvar.wait()
>         self.process(self.items)
>         self.items = []
>     finally:
>         self.condvar.release()
>
> I find that the "block locking" version looks just fine; it makes the
> scope of the condition variable quite clear despite not having any
> explicit acquire() or release() calls (there are some abstracted away
> in the wait() call too!).
Actually typing it all out like that makes it hard to dislike.
Yup, that reads fine to me too.
I don't think anyone has mentioned this yet, so I will: library
writers using Decimal (or more generally HW 754 gimmicks) have a need
to fiddle lots of thread-local state ("numeric context"), and must
restore it no matter how the routine exits. Like "boost precision to
twice the user's value over the next 12 computations, then restore",
and "no matter what happens here, restore the incoming value of the
overflow-happened flag". It's just another instance of temporarily
taking over a shared resource, but I think it's worth mentioning that
there are a lot of things "like that" in the world, and to which
decorators don't really sanely apply.
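Tim's save/boost/restore pattern can be sketched today with the decimal module and a plain try/finally; the helper name `with_extra_precision` is invented for illustration:

```python
import decimal

def with_extra_precision(extra, func):
    # Save the thread-local numeric context's precision, boost it by
    # `extra` digits, and restore it no matter how func() exits.
    ctx = decimal.getcontext()
    saved_prec = ctx.prec
    ctx.prec += extra
    try:
        return func()
    finally:
        ctx.prec = saved_prec

before = decimal.getcontext().prec
result = with_extra_precision(10, lambda: decimal.Decimal(1) / decimal.Decimal(7))
assert decimal.getcontext().prec == before     # context restored on exit
assert str(result).startswith("0.142857")      # computed at boosted precision
```

This is exactly the sort of boilerplate a block template would hide.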
Re: [Python-Dev] Need to hook Py_FatalError
"m.u.k" <[EMAIL PROTECTED]> wrote:

> Josiah Carlson <[EMAIL PROTECTED]> wrote in news:[EMAIL PROTECTED]:
>
> > Offering any hook for Py_FatalError may not even be enough, as some of
> > those errors are caused by insufficient memory. What if a hook were
> > available, but it couldn't be called because there wasn't enough memory?
> >
> > Of course there is the option of pre-allocating a few kilobytes, then
> > just before one calls the hook, freeing that memory so that the hook can
> > execute (assuming the hook is small enough). I'm not sure if this is a
> > desirable general mechanic, but it may be sufficient for you. If you
> > do figure out a logging mechanism that is almost guaranteed to execute
> > on FatalError, post it to sourceforge.
>
> IMHO this should be left to the hooker (apparently not the right word, but you get the point :) ). If he allocates more memory or does heavy stuff, that will just fail. Anyway, abort() is a failure too. Either abort() will end the process or the OS will, on such a critical error.

I'm not talking about doing memory-intensive callbacks; I'm talking about the function call itself.

From what I understand, any function call in Python requires a memory allocation. This is trivially true in the case of reentrant Python calls, which require the allocation of a frame object from heap memory, and, in the case of all calls, C stack memory. If you cannot allocate a frame for __del__ method calling (one of the error conditions), you certainly aren't going to be able to call a Python callback (no heap memory), and may not have the stack memory required by your logging function, even if it is written in C (especially if you construct a nontrivial portion of the message in memory before it is printed).

If I'm wrong, I'd like to hear it, but I'm still waiting for your patch on sourceforge.

- Josiah
Re: [Python-Dev] PEP 340 -- loose ends
[Guido]
> [Holger]
> > However, i would find it much clearer if *defining* blocktemplates
> > used a new keyword, like:
> >
> >     blocktemplate opening(filename, mode="r"):
> >         ...
> >
> > because this immediately tells me what the purpose and semantics
> > of the following definition is. The original overloading of 'def' to
> > mean generators if the body contains a yield statement was already a
> > matter of discussion (AFAIK). When i came to Python it was at 2.2
> > and i remember wondering about this "def" oddity.
> >
> > Extending poor old 'def' functions now to possibly mean block
> > templates gives me semantical overload even if it is justified
> > from an implementation point of view. I am talking purely
> > about (my sense of) code readability here not about implementation.
>
> Hm... Maybe you also want to have separate function and procedure
> keywords? Or static typing? 'def' can be used to define all sorts of
> things, that is Python's beauty!

Sure, 'def' is nice, and i certainly wouldn't introduce a new keyword for adding e.g. static typing to function 'defs'. But for my taste, blocktemplates depart enough from the old-style function/subroutine notion that many people still think of when seeing a 'def'. When (new) people see something like 'blocktemplate ...:' they know they have to look it up in the language documentation or in some book under 'blocktemplate', instead of trying to figure out (what the hell) this "function" or "generator" does and how they can use it. Or they might simply think they can invoke it from a for-loop, which - as far as i understand - could lead to silent errors, no?

Let me add that, with the growing number of Python programmers (as stated in your PyCon 2005 keynote), it seems to make sense to increase emphasis on how new syntax/concepts will be viewed/used by possibly hundreds of thousands of programmers already familiar with (some version of) Python.

But i also see your point about confronting people with the fact that Python has a nice unified 'def' statement, so i guess it's a balancing act.

cheers,

holger
Re: [Python-Dev] PEP 340 -- concept clarification
> > In contrast, the new use of yield differs in that the suspended frame
> > transfers control from the encloser to the enclosed.
>
> Why does your notion of who encloses whom suddenly reverse when you go
> from a for-loop to a block-statement? This all feels very strange to
> me.

After another reading of the PEP, it seems fine. On the earlier readings, the "yield" felt disorienting because the body of the block is subordinate to the block-iterator, yet its code is co-located with the caller (albeit set off with a colon and indentation).

> I meant to use this as an example of the unsuitability of the
> @synchronized decorator, since it implies that all synchronization is
> on the same mutex, thereby providing a use case for the locking
> block-statement.
>
> I suspect we're violently in agreement though.

Right :-)

> > This is the strongest example so far. When adding it to the PEP, it
> > would be useful to contrast the code with simpler alternatives like PEP
> > 288's g.throw() or PEP 325's g.close(). On the plus side, the
> > block-iterator approach factors out code common to multiple callers. On
> > the minus side, the other PEPs involve simpler mechanisms and their
> > learning curve would be nearly zero. These pluses and minuses are
> > important because they apply equally to all examples using blocks for
> > initialization/finalization.
>
> Where do you see a learning curve for blocks?

Altering the meaning of a for-loop; introducing a new keyword; extending the semantics of "break" and "continue"; allowing try/finally inside a generator; introducing new control flow; adding new magic methods __next__ and __exit__; adding a new context for "as"; and transforming "yield" from statement semantics to expression semantics. This isn't a lightweight proposal, and not one where we get transference of knowledge from other languages (except for a few users of Ruby, Smalltalk, etc.).

By comparison, g.throw() or g.close() are trivially simple approaches to generator/iterator finalization.

In the section on the new for-loop specification, what is the purpose of "arg"? Can it be replaced with the constant None?

    itr = iter(EXPR1)
    brk = False
    while True:
        try:
            VAR1 = next(itr, None)
        except StopIteration:
            brk = True
            break
        BLOCK1
    if brk:
        BLOCK2

In "block expr as var", can "var" be any lvalue?

    block context() as inputfil, outputfil, errorfil:
        for i, line in enumerate(inputfil):
            if not checkformat(line):
                print >> errorfil, line
            else:
                print >> outputfil, secret_recipe(line)

In re-reading the examples, it occurred to me that the word "block" already has meaning in the context of threading.Lock.acquire(), which has an optional blocking argument defaulting to 1.

In example 4, consider adding a comment that the "continue" has its normal (non-extending) meaning.

The examples should demonstrate the operation of the extended form of "continue", "break", and "return" in the body of the block.

Raymond
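For concreteness, the expansion above can be exercised with real pieces. This is a hypothetical rendering with EXPR1 = range(3); since a plain iterator's __next__ takes no argument (the PEP's "arg" is what "continue EXPR" would pass), next() is called bare here:

```python
# Concrete rendering of the PEP's proposed for-loop expansion.
# EXPR1 = range(3); BLOCK1 appends to `result`; BLOCK2 records completion.
result = []
completed = False

itr = iter(range(3))
brk = False
while True:
    try:
        # The PEP's next(itr, arg) would pass `arg` into the iterator;
        # an ordinary iterator takes no argument, so it is omitted here.
        var1 = next(itr)
    except StopIteration:
        brk = True
        break
    result.append(var1)        # BLOCK1
if brk:
    completed = True           # BLOCK2 (the loop's "else" clause)

assert result == [0, 1, 2]
assert completed
```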
Re: [Python-Dev] PEP 340: Breaking out.
Guido van Rossum wrote:

> [Skip Montanaro]
> > To the casual observer, this looks like "break" should break out of the loop:
> >
> >     while True:
> >         block synchronized(v1):
> >             ...
> >             if v1.field:
> >                 break
> >         time.sleep(1)
>
> Without 'block' this would be written as try/finally. And my point is
> that people just don't write try/finally inside a while loop very
> often (I found *no* examples in the entire standard library).

Errr... Dutch example: Dining Philosophers (Dijkstra)

--eric
Re: [Python-Dev] PEP 340: Only for try/finally?
> > Guido> How many try/finally statements have you written inside a loop?
> > Guido> In my experience this is extremely rare. I found no
> > Guido> occurrences in the standard library.
> >
> > [Skip again] How'd we start talking about try/finally?
>
> Because it provides by far the dominant use case for 'block'. The
> block-statement is intended to replace many boilerplate uses of
> try/finally. In addition, it's also a coroutine invocation primitive.

Maybe I'm not understanding something, but why should "block" only be for less boilerplate in try/finallys? I spent an hour grepping through the standard library, and there are indeed lots of use cases for blocks to replace try/finallys. There are opportunities for block opening(file) and block locked(mutex) everywhere!

But why stop there? Lots of functions that take a callable as an argument could be upgraded to use the new block syntax, because it is a cool way to do template method, isn't it? Take wrapper() in curses/wrapper.py, for example. Why have it like this:

    wrapper(curses_wrapped_main)

when you can have it like this:

    block wrapper():
        (main program stuff)
        (...)

Or assertRaises in unittest.py -- why call it like this:

    self.assertRaises(TypeError, lambda: a*x)

when you can squash the lambda like this:

    block self.assertRaises(TypeError):
        a*x

Or, for another use case: in GL code you often write glBegin().. glDrawBlah().. glEnd(). Make it properly indented!

    block glNowDraw():    # glBegin(); yield; glEnd()
        glDrawBlah()

Make your own repeat-until loop:

    def until(cond):
        while True:
            yield None
            if cond():
                break

    block until(lambda: s == "quit"):
        s = sys.stdin.readline()

It seems like the possibilities are endless. Maybe too endless? This new feature is so similar to anonymous functions, but is not quite anonymous functions -- so why not introduce anonymous functions instead, which could do all the things block can, and more? But as I said, I'm misunderstanding something.

-- mvh Björn
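Björn's repeat-until can already be had as an ordinary function taking callables -- the "anonymous functions instead" alternative he alludes to. The names `until` and `read_line` are invented for this sketch:

```python
def until(cond, body):
    # Repeat-until: run body() at least once, stop once cond() is true.
    while True:
        body()
        if cond():
            break

lines = iter(["hello", "world", "quit", "ignored"])
seen = []

def read_line():
    seen.append(next(lines))

# Reads until the last line read equals "quit"; "ignored" is never read.
until(lambda: seen[-1] == "quit", read_line)
assert seen == ["hello", "world", "quit"]
```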
Re: [Python-Dev] PEP 340 -- concept clarification
At 05:30 PM 5/3/05 -0400, Raymond Hettinger wrote:

> By comparison, g.throw() or g.close() are trivially simple approaches
> to generator/iterator finalization.

That reminds me of something: in PEP 333 I proposed use of a 'close()' attribute in anticipation of PEP 325, so that web applications implemented as generators could take advantage of resource cleanup. Is there any chance that as part of PEP 340, 'close()' could translate to the same as '__exit__(StopIteration)'? If not, modifying PEP 333 to support '__exit__' is going to be a bit of a pain, especially since there's code in the field now with that assumption.
Re: [Python-Dev] PEP 340 -- concept clarification
[Guido]
> > Where do you see a learning curve for blocks?

[Raymond]
> Altering the meaning of a for-loop; introducing a new keyword; extending
> the semantics of "break" and "continue"; allowing try/finally inside a
> generator; introducing new control flow; adding new magic methods
> __next__ and __exit__; adding a new context for "as"; and transforming
> "yield" from statement semantics to expression semantics. This isn't a
> lightweight proposal and not one where we get transference of knowledge
> from other languages (except for a few users of Ruby, Smalltalk, etc).

[Bah, gmail just lost my draft. :-( Trying to reconstruct...]

But there are several separable proposals in the PEP. Using "continue EXPR", which calls its __next__(EXPR), which becomes the return value of a yield-expression, is entirely orthogonal (and come to think of it the PEP needs a motivating example for this).

And come to think of it, using a generator to "drive" a block statement is also separable; with just the definition of the block statement from the PEP you could implement all the examples using a class (similar to example 6, which is easily turned into a template). I think that seeing just two of the examples would be enough for most people to figure out how to write their own, so that's not much of a learning curve IMO.

> By comparison, g.throw() or g.close() are trivially simple approaches
> to generator/iterator finalization.

But much more clumsy to use, since you have to write your own try/finally.

> In the section on the new for-loop specification, what is the purpose of
> "arg"? Can it be replaced with the constant None?

No, it is set by the "continue EXPR" translation given just below it. I'll add a comment; other people also missed this.

> In "block expr as var", can "var" be any lvalue?

Yes. That's what I meant by "VAR1 is an arbitrary assignment target (which may be a comma-separated list)". I'm adding an example that shows this usage.

> In re-reading the examples, it occurred to me that the word "block"
> already has meaning in the context of threading.Lock.acquire(), which
> has an optional blocking argument defaulting to 1.

Yeah, Holger also pointed out that block is a common variable name... :-(

> In example 4, consider adding a comment that the "continue" has its
> normal (non-extending) meaning.

I'd rather not, since this would just increase the confusion between the body of the generator (where yield has a special meaning) vs. the body of the block-statement (where continue, break, return and exceptions have a special meaning). Also note example 5, which has a yield inside a block-statement. This is the block statement's equivalent to using a for-loop with a yield in its body in a regular generator when it is invoking another iterator or generator recursively.

> The examples should demonstrate the operation of the extended form of
> "continue", "break", and "return" in the body of the block.

Good point. (Although break and return don't really have an extended form -- they just get new semantics in a block-statement.) I'll have to think about those.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
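The yield-expression semantics described here ("continue EXPR" becoming the value of a yield) can be sketched with generator .send(), which later shipped in Python 2.5 via PEP 342; the accumulator below is invented for illustration:

```python
def accumulate():
    # Each value passed in resumes the `yield` expression with that value,
    # mirroring the PEP's "continue EXPR" -> value-of-yield idea.
    total = 0
    while True:
        received = yield total
        if received is None:
            break
        total += received

g = accumulate()
assert next(g) == 0      # prime the generator: run to the first yield
assert g.send(5) == 5    # 5 becomes the value of `yield total` inside
assert g.send(2) == 7
```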
Re: [Python-Dev] PEP 340: Only for try/finally?
At 11:54 PM 5/3/05 +0200, BJörn Lindqvist wrote:

> It seems like the possibilities are endless. Maybe too endless?
> Because this new feature is so similar to anonymous functions, but is
> not quite anonymous functions, so why not introduce anonymous
> functions instead, that could make all the things block can, and more?
> But as I said, I'm misunderstanding something.

Anonymous functions can't rebind variables in their enclosing function. It could be argued that it's better to fix this rather than inventing a new macro-like facility, but I don't know how such a rebinding facility could preserve readability as well as PEP 340 does.

Also, many of your examples are indeed improvements over calling a function that takes a function. The block syntax provides a guarantee that the block will be executed immediately or not at all. Once you are past the block suite in the code, you know it will not be re-executed, because no reference to it is ever held by the called function. You do not have this same guarantee when you see a function-taking-function being invoked.

So a block suite tells you that the control flow is more-or-less linear, whereas a function definition raises the question of *when* that function will be executed, and whether you have exhaustive knowledge of the possible places from which it may be called.
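Phillip's rebinding point can be made concrete. At the time of this discussion a nested function could not rebind an enclosing variable at all; the `nonlocal` statement (PEP 3104, Python 3.0) later filled exactly this gap:

```python
def counter():
    count = 0
    def increment():
        # Without `nonlocal` (added in Python 3.0, PEP 3104), rebinding
        # `count` here would make it local and raise UnboundLocalError.
        nonlocal count
        count += 1
    increment()
    increment()
    return count

assert counter() == 2
```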
Re: [Python-Dev] PEP 340: Breaking out.
Guido van Rossum wrote:

> [Skip Montanaro]
> > Guido> How many try/finally statements have you written inside a loop?
> > Guido> In my experience this is extremely rare. I found no
> > Guido> occurrences in the standard library.
> >
> > How'd we start talking about try/finally?
>
> Because it provides by far the dominant use case for 'block'. The
> block-statement is intended to replace many boilerplate uses of
> try/finally. In addition, it's also a coroutine invocation primitive.

I would expect programmers to do more than only replace existing try/finally blocks. The support for RAII patterns in Python might result in more use of RAII primitives, and some may fit very well inside a loop. It might not be a bad idea to look at what other languages are doing with RAII. Also, even if there's no occurrence right now in the standard library, it doesn't mean it has always been the case in the code's evolution, where debugging such a pitfall would not be cool.

FWIW, I expect most generators used in block-syntax not to be loops. What would it imply to support these in passing "break" on to the parent loop at run-time? Maybe generators are not the way to go, but block templates could be supported natively by providing a __block__ function, very similarly to sequences providing an __iter__ function for for-loops?

We could avoid explaining to a newbie why the following code doesn't work, if "opening" could be implemented in a way that makes it work:

    for filename in filenames:
        block opening(filename) as file:
            if someReason:
                break

By the way, FWIW, my preference is to have no keyword, making it clearer that some block statements are loops and others not, but probably amplifying the "break" problem.

Regards,
Nicolas
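What eventually shipped as the `with` statement (PEP 343) gives this example the behavior Nicolas wants: finalization runs even when the body breaks out of the enclosing loop, and `break` reaches the for-loop because `with` is not itself a loop. A sketch with a stand-in resource (the recording `opening` helper is invented here):

```python
from contextlib import contextmanager

released = []

@contextmanager
def opening(name):
    # Stand-in resource: record release instead of opening a real file.
    try:
        yield name
    finally:
        released.append(name)

for filename in ["a.txt", "b.txt", "c.txt"]:
    with opening(filename) as f:
        if filename == "b.txt":   # someReason
            break                 # break reaches the for-loop...

# ...yet "b.txt" was still released promptly on the way out.
assert released == ["a.txt", "b.txt"]
```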
Re: [Python-Dev] PEP 340 -- concept clarification
[Phillip] > That reminds me of something; in PEP 333 I proposed use of a 'close()' > attribute in anticipation of PEP 325, so that web applications implemented > as generators could take advantage of resource cleanup. Is there any > chance that as part of PEP 340, 'close()' could translate to the same as > '__exit__(StopIteration)'? If not, modifying PEP 333 to support '__exit__' > is going to be a bit of a pain, especially since there's code in the field > now with that assumption. Maybe if you drop support for the "separate protocol" alternative... :-) I had never heard of that PEP. How much code is there in the field? Written by whom? I suppose you can always write a decorator that takes care of the mapping. I suppose it should catch and ignore the StopIteration that __exit__(StopIteration) is likely to throw. -- --Guido van Rossum (home page: http://www.python.org/~guido/) ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
[Python-Dev] Py_UNICODE madness
The documentation for Py_UNICODE states the following:

"This type represents a 16-bit unsigned storage type which is used by Python internally as basis for holding Unicode ordinals. On platforms where wchar_t is available and also has 16-bits, Py_UNICODE is a typedef alias for wchar_t to enhance native platform compatibility. On all other platforms, Py_UNICODE is a typedef alias for unsigned short."

However, we have found this not to be true on at least certain RedHat versions (maybe all, but I'm not willing to say that at this point). pyconfig.h on these systems reports that PY_UNICODE_TYPE is wchar_t, and PY_UNICODE_SIZE is 4. Needless to say, this isn't consistent with the docs. It also creates quite a few problems when attempting to interface Python with other libraries which produce unicode data.

Is this a bug, or is this behaviour intended?

It turns out that at some point in the past, this created problems for tkinter as well, so someone just changed the internal unicode representation in tkinter to be 4 bytes as well, rather than tracking down the real source of the problem.

Is PY_UNICODE_TYPE always going to be guaranteed to be 16 bits, or is it dependent on your platform? (In which case we can give up now on Python unicode compatibility with any other libraries.) At the very least, if we can't guarantee the internal representation, then the PyUnicode_FromUnicode API needs to go away, and be replaced with something capable of transcoding various unicode inputs into the internal python representation.

--
Nick
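A quick way to see which width a given interpreter was built with is sys.maxunicode: 0xFFFF on a narrow (UCS-2) build, 0x10FFFF on a wide (UCS-4) build like the Red Hat ones described. (Current CPython releases dropped the distinction with PEP 393, so maxunicode is now always 0x10FFFF.)

```python
import sys

# Narrow builds stored Py_UNICODE in 16 bits (maxunicode == 0xFFFF);
# wide (UCS-4) builds used 32 bits (maxunicode == 0x10FFFF).
if sys.maxunicode == 0xFFFF:
    unit_bits = 16
else:
    unit_bits = 32

assert sys.maxunicode in (0xFFFF, 0x10FFFF)
assert unit_bits in (16, 32)
```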
Re: [Python-Dev] PEP 340: Breaking out.
> FWIW, I expect most generators used in block-syntax to not be loops.
> What would imply to support these to pass "break" to parent loop at
> run-time?

I proposed this at some point during the discussion leading up to the PEP, and it was booed away as too fragile (and I agree). You're just going to have to learn to deal with it, just as you can't break out of two nested loops (but you can return from the innermost loop).

> Maybe generators are not the way to go, but could be
> supported natively by providing a __block__ function, very similarly to
> sequences providing an __iter__ function for for-loops?

Sorry, I have no idea what you are proposing here.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] Py_UNICODE madness
I think that documentation is wrong; AFAIK Py_UNICODE has always been allowed to be either 16 or 32 bits, and the source code goes through great lengths to make sure that you get a link error if you try to combine extensions built with different assumptions about its size.

On 5/3/05, Nicholas Bastin <[EMAIL PROTECTED]> wrote:
> The documentation for Py_UNICODE states the following:
>
> "This type represents a 16-bit unsigned storage type which is used by
> Python internally as basis for holding Unicode ordinals. On platforms
> where wchar_t is available and also has 16-bits, Py_UNICODE is a
> typedef alias for wchar_t to enhance native platform compatibility. On
> all other platforms, Py_UNICODE is a typedef alias for unsigned
> short."
>
> However, we have found this not to be true on at least certain RedHat
> versions (maybe all, but I'm not willing to say that at this point).
> pyconfig.h on these systems reports that PY_UNICODE_TYPE is wchar_t,
> and PY_UNICODE_SIZE is 4. Needless to say, this isn't consistent with
> the docs. It also creates quite a few problems when attempting to
> interface Python with other libraries which produce unicode data.
>
> Is this a bug, or is this behaviour intended?
>
> It turns out that at some point in the past, this created problems for
> tkinter as well, so someone just changed the internal unicode
> representation in tkinter to be 4 bytes as well, rather than tracking
> down the real source of the problem.
>
> Is PY_UNICODE_TYPE always going to be guaranteed to be 16 bits, or is
> it dependent on your platform? (in which case we can give up now on
> Python unicode compatibility with any other libraries). At the very
> least, if we can't guarantee the internal representation, then the
> PyUnicode_FromUnicode API needs to go away, and be replaced with
> something capable of transcoding various unicode inputs into the
> internal python representation.
>
> --
> Nick

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] PEP 340 -- concept clarification
Summary: Resource Managers are a good idea. First Class Suites may be a good idea. Block Iterators try to split the difference. They're not as powerful as First Class Suites, and not as straightforward as Resource Managers. This particular middle ground didn't work out so well. On 5/3/05, Guido van Rossum <[EMAIL PROTECTED]> wrote: > [Jim Jewett] ... > > With the block, every yield returns to a single designated callback. > > This callback had to be established at the same time the block was > > created, and must be textually inside it. (An indented suite to the > > "block XXX:" line.) > Doesn't convince me. The common use for a regular generator is in a > for-loop, where every yield also returns to a single designated place > (calling it callback is really deceptive!). I do not consider the body of a for-loop a to be callback; the generator has no knowledge of that body. But with a Block Iterator, the generator (or rather, its unrolled version) does need to textually contain the to-be-included suite -- which is why that suite smells like a callback function that just doesn't happen to be named. > And with a block, you're free to put the generator call ahead of the > block so you can call next() on it manually: > > it = EXPR1 > block it: > BLOCK1 > ... lets you call next() on it as you please (until the > block is exited, for sure). For a Resource Manager, the only thing this could do is effectively discard the BLOCK1, because the yields would have been used up (and the resource deallocated). I suppose this is another spelling of "resources are not loops". > > But are there plenty of other use cases for PEP 340? > Yes. Patterns like "do this little dance in a try/finally block" and > "perform this tune when you catch an XYZ exception" are pretty common ... Let me rephrase ... The Block Iterator syntax gets awkward if it needs to yield more than once (and the exits are not interchangable). You have said that is OK because most Resource Managers only yield once. 
But if you're willing to accept that, then why not just limit it to a Resource Manager instead of an Iterator? Resource Managers could look similar to the current proposal, but would be less ambitious. They should have absolutely no connection to loops/iterators/generators. There should be no internal secret loop. if they use the "yield" keyword, it should be described as "yielding control" rather than "yielding the next value." There would be only one yielding of control per Resource Manager. If limiting the concept to Resource Managers is not acceptable, then I still don't think Block Iterators are the right answer -- though First Class Suites might be. (And so might "No Changes at all".) Reasoning: If there is only one yield, then you're really just wrapping the call to the (unnamed) suite. (Q)Why are decorators not appropriate? (A1) In some cases, the wrapper needs to capture an instance-variable, which isn't available at definition-time. (A2) Decorators can be ugly. This is often because the need to return a complete replacement callable leads to too many nested functions. These are both problems with decorators. They do argue for improving the decorator syntax, but not for throwing out the concept. I don't think that Block Iterators will really clear things up -- to me, they just look like a different variety of fog. If decoration doesn't work, why not use a regular function that takes a callback? Pass the callback instead of defining an anonymous suite. Call the callback instead of writing the single yield. ... > ... you are proposing to solve all its use cases by defining an > explicit function or method representing the body of the block. Yes. > The latter solution leads to way too much ugly code -- all that > function-definition boilerplate is worse than the try/finally > boilerplate we're trying to hide! In the cases I've actually seen, the ugly function definition portions are in the decorator, rather than the regular function. 
It trades a little ugliness that gets repeated all over the place for a lot of ugliness that happens only once (in the decorator). That said, I'm willing to believe that breaking out a method might sometimes be a bad idea. In which case you probably want First Class (and decorable) Suites. If First Class Suites are not acceptable in general, then let's figure out where they are acceptable. For me, Resource Manager is a good use case, but Block Iterator is not. -jJ ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
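Jim's alternative above -- "a regular function that takes a callback" -- can be sketched in a few lines of today's Python. The names (`locking`, `body`) are hypothetical, chosen only for illustration:

```python
import threading

def locking(lock, callback):
    # A plain function wraps the callback in try/finally:
    # no generators, no block syntax, no secret internal loop.
    lock.acquire()
    try:
        return callback()
    finally:
        lock.release()

counter = []
lock = threading.Lock()

# The would-be block body becomes a named callable.
def body():
    counter.append(len(counter) + 1)
    return counter[-1]

result = locking(lock, body)
```

The cost Jim concedes is exactly the function-definition boilerplate Guido objects to: every "suite" must be broken out into a named (or lambda) callable.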
Re: [Python-Dev] PEP 340 -- concept clarification
Sorry Jim, but I just don't think you & I were intended to be on the same language design committee. Nothing you say seems to be making any sense to me these days. Maybe someone else can channel you effectively, but I'm not going to try to do a line-by-line response to your email quoted below.

On 5/3/05, Jim Jewett <[EMAIL PROTECTED]> wrote:
> Summary:
>
> Resource Managers are a good idea.
> First Class Suites may be a good idea.
>
> Block Iterators try to split the difference. They're not as powerful
> as First Class Suites, and not as straightforward as Resource
> Managers. This particular middle ground didn't work out so well.
>
> On 5/3/05, Guido van Rossum <[EMAIL PROTECTED]> wrote:
> > [Jim Jewett]
> ...
> > > With the block, every yield returns to a single designated callback.
> > > This callback had to be established at the same time the block was
> > > created, and must be textually inside it. (An indented suite to the
> > > "block XXX:" line.)
>
> > Doesn't convince me. The common use for a regular generator is in a
> > for-loop, where every yield also returns to a single designated place
> > (calling it callback is really deceptive!).
>
> I do not consider the body of a for-loop to be a callback; the generator
> has no knowledge of that body.
>
> But with a Block Iterator, the generator (or rather, its unrolled version)
> does need to textually contain the to-be-included suite -- which is why
> that suite smells like a callback function that just doesn't happen to be
> named.
>
> > And with a block, you're free to put the generator call ahead of the
> > block so you can call next() on it manually:
> >
> >     it = EXPR1
> >     block it:
> >         BLOCK1
> >
> > ... lets you call next() on it as you please (until the
> > block is exited, for sure).
>
> For a Resource Manager, the only thing this could do is effectively
> discard the BLOCK1, because the yields would have been used
> up (and the resource deallocated).
>
> I suppose this is another spelling of "resources are not loops".
>
> > > But are there plenty of other use cases for PEP 340?
>
> > Yes. Patterns like "do this little dance in a try/finally block" and
> > "perform this tune when you catch an XYZ exception" are pretty common
> > ...
>
> Let me rephrase ...
>
> The Block Iterator syntax gets awkward if it needs to yield more than
> once (and the exits are not interchangeable). You have said that is OK
> because most Resource Managers only yield once.
>
> But if you're willing to accept that, then why not just limit it to a Resource
> Manager instead of an Iterator? Resource Managers could look similar
> to the current proposal, but would be less ambitious. They should have
> absolutely no connection to loops/iterators/generators. There should be
> no internal secret loop. If they use the "yield" keyword, it should be
> described as "yielding control" rather than "yielding the next value." There
> would be only one yielding of control per Resource Manager.
>
> If limiting the concept to Resource Managers is not acceptable, then
> I still don't think Block Iterators are the right answer -- though First Class
> Suites might be. (And so might "No Changes at all".)
>
> Reasoning:
>
> If there is only one yield, then you're really just wrapping the call to
> the (unnamed) suite.
>
> (Q) Why are decorators not appropriate?
>
> (A1) In some cases, the wrapper needs to capture an
> instance variable, which isn't available at definition time.
> (A2) Decorators can be ugly. This is often because the
> need to return a complete replacement callable leads to too
> many nested functions.
>
> These are both problems with decorators. They do argue for
> improving the decorator syntax, but not for throwing out the
> concept. I don't think that Block Iterators will really clear things
> up -- to me, they just look like a different variety of fog.
>
> If decoration doesn't work, why not use a regular function
> that takes a callback? Pass the callback instead of defining an
> anonymous suite. Call the callback instead of writing the single
> yield.
>
> ...
>
> > ... you are proposing to solve all its use cases by defining an
> > explicit function or method representing the body of the block.
>
> Yes.
>
> > The latter solution leads to way too much ugly code -- all that
> > function-definition boilerplate is worse than the try/finally
> > boilerplate we're trying to hide!
>
> In the cases I've actually seen, the ugly function definition portions
> are in the decorator, rather than the regular function. It trades a
> little ugliness that gets repeated all over the place for a lot of ugliness
> that happens only once (in the decorator).
>
> That said, I'm willing to believe that breaking out a method might
> sometimes be a bad idea. In which case you probably want
> First Class (and decorable) Suites.
>
> If First Class Suites are not acceptable in general, then let's figure
> out where they are acceptable. For me, Resource Manager is a good
> use case, but Block Iterator is not.
Re: [Python-Dev] PEP 340 -- concept clarification
At 03:33 PM 5/3/05 -0700, Guido van Rossum wrote: >[Phillip] > > That reminds me of something; in PEP 333 I proposed use of a 'close()' > > attribute in anticipation of PEP 325, so that web applications implemented > > as generators could take advantage of resource cleanup. Is there any > > chance that as part of PEP 340, 'close()' could translate to the same as > > '__exit__(StopIteration)'? If not, modifying PEP 333 to support '__exit__' > > is going to be a bit of a pain, especially since there's code in the field > > now with that assumption. > >Maybe if you drop support for the "separate protocol" alternative... :-) I don't understand you. Are you suggesting a horse trade, or...? >I had never heard of that PEP. How much code is there in the field? Maybe a dozen or so web applications and frameworks (including Zope, Quixote, PyBlosxom) and maybe a half dozen servers (incl. Twisted and mod_python). A lot of the servers are based on my wsgiref library, though, so it probably wouldn't be too horrible a job to make everybody add support; I might even be able to fudge wsgiref so that wsgiref-based servers don't even see an issue. Modifying the spec is potentially more controversial, however; it'll have to go past the Web-SIG, and I assume the first thing that'll be asked is, "Why aren't generators getting a close() method then?", so I figured I should ask that question first. I'd completely forgotten about this being an issue until Raymond mentioned g.close(); I'd previously gotten the impression that PEP 325 was expected to be approved, otherwise I wouldn't have written support for it into PEP 333. >Written by whom? I used to know who all had written implementations, but there are now too many to keep track of. ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 340 -- concept clarification
At 07:27 PM 5/3/05 -0400, Phillip J. Eby wrote: >Modifying the spec is potentially more controversial, however; it'll have >to go past the Web-SIG, and I assume the first thing that'll be asked is, >"Why aren't generators getting a close() method then?", so I figured I >should ask that question first. You know what, never mind. I'm still going to write the Web-SIG so they know the change is coming, but this is really a very minor thing; just a feature we won't get "for free" as a side effect of PEP 325. Your decorator idea is a trivial solution, but it would also be trivial to allow WSGI server implementations to call __exit__ on generators. None of this affects existing code in the field, because today you can't write a try/finally in a generator anyway. Therefore, nobody is relying on this feature, therefore it's basically moot. ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
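The wrapper idea Phillip alludes to -- letting a server finalize an application iterable through a `close()` hook -- is simple to sketch. `ClosingIterator` is a hypothetical name for illustration, not WSGI's actual API (Python generators later grew a real `close()` of their own via PEP 342):

```python
class ClosingIterator:
    # Hypothetical wrapper: gives any iterable a close() that runs a
    # cleanup callback -- the feature PEP 333 hoped to get "for free"
    # from PEP 325.
    def __init__(self, iterable, cleanup):
        self._it = iter(iterable)
        self._cleanup = cleanup

    def __iter__(self):
        return self

    def __next__(self):
        return next(self._it)

    def close(self):
        self._cleanup()

cleaned = []

def app():
    yield "chunk-1"
    yield "chunk-2"

it = ClosingIterator(app(), lambda: cleaned.append(True))
first = next(it)
it.close()   # server aborts early; cleanup still runs
```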
Re: [Python-Dev] PEP 340 -- concept clarification
> >Maybe if you drop support for the "separate protocol" alternative... :-)
>
> I don't understand you. Are you suggesting a horse trade, or...?
Only tongue-in-cheek. :-)
> >I had never heard of that PEP. How much code is there in the field?
>
> Maybe a dozen or so web applications and frameworks (including Zope,
> Quixote, PyBlosxom) and maybe a half dozen servers (incl. Twisted and
> mod_python). A lot of the servers are based on my wsgiref library, though,
> so it probably wouldn't be too horrible a job to make everybody add
> support; I might even be able to fudge wsgiref so that wsgiref-based
> servers don't even see an issue.
>
> Modifying the spec is potentially more controversial, however; it'll have
> to go past the Web-SIG, and I assume the first thing that'll be asked is,
> "Why aren't generators getting a close() method then?", so I figured I
> should ask that question first.
>
> I'd completely forgotten about this being an issue until Raymond mentioned
> g.close(); I'd previously gotten the impression that PEP 325 was expected
> to be approved, otherwise I wouldn't have written support for it into PEP 333.
>
> >Written by whom?
>
> I used to know who all had written implementations, but there are now too
> many to keep track of.
Given all that, it's not infeasible to add a close() method to
generators as a shortcut for this:
    def close(self):
        try:
            self.__exit__(StopIteration)
        except StopIteration:
            pass
        else:
            # __exit__() didn't raise StopIteration
            raise RuntimeError("or some other exception")
I'd like the block statement to be defined exclusively in terms of
__exit__() though.
--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] PEP 340 -- loose ends
Another loose end (which can partially explain why I still thought __next__ took an exception ;)

In "Specification: Generator Exit Handling"::

    "When __next__() is called with an argument that is not None, the
    yield-expression that it resumes will return the value attribute
    of the argument."

I think this should read::

    "When __next__() is called with an argument that is not None, the
    yield-expression that it resumes will return the argument."

Tim Delaney
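Tim's reading is the one that ultimately stuck: in modern Python (PEP 342), `generator.send(value)` resumes the generator and the paused yield expression evaluates to the argument itself, not to some attribute of it:

```python
def echo():
    received = None
    while True:
        # The yield expression evaluates to whatever the caller
        # passes via send() (or None for a plain next()).
        received = yield received

g = echo()
next(g)            # advance to the first yield
out = g.send(42)   # the paused yield evaluates to 42, which is echoed back
```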
Re: [Python-Dev] Need to hook Py_FatalError
On Tue, 2005-05-03 at 12:54 -0500, Jeff Epler wrote:
> On Tue, May 03, 2005 at 09:15:42AM -0700, Guido van Rossum wrote:
> > But tell me, what do you want the process to do instead of
> > terminating? Py_FatalError is used in situations where raising an
> > exception is impossible or would do more harm than good.
>
> In an application which embeds Python, I want to show the application's
> standard error dialog, which doesn't call any Python APIs (but does do
> things like capture the call stack at the time of the error). For this
> use, it doesn't matter that no further calls to those APIs are possible.
>
> Jeff

+1 here. In my case (PostgreSQL), it would probably be wiser to map Py_FatalError calls to Postgres' ereport(FATAL, (...)), as it does appear to do some cleaning up on exit, and if there's a remote user, it could actually give the user the message.

[http://python.project.postgresql.org]

--
Regards, James William Pye
Re: [Python-Dev] PEP 340 -- concept clarification
> But there are several separable proposals in the PEP. Using "continue
> EXPR" which calls its __next__(EXPR) which becomes the return value of
> a yield-expression is entirely orthogonal (and come to think of it the
> PEP needs a motivating example for this).
>
> And come to think of it, using a generator to "drive" a block
> statement is also separable; with just the definition of the block
> statement from the PEP you could implement all the examples using a
> class (similar to example 6, which is easily turned into a template).

I think that realization is important. It would be great to have a section of the PEP that focuses on separability and matching features to benefits. Start with the above observation that the proposed examples can be achieved with generators driving the block statement. When the discussion hits comp.lang.python, a separability section will help focus the conversation (there's a flaw/issue/dislike about feature x; however, features y/z and related benefits do not depend on x). Essentially, having generators as block drivers is the base proposal. Everything else is an elaboration.

> > By comparison, g.throw() or g.close() are trivially simple approaches
> > to generator/iterator finalization.

> But much more clumsy to use since you have to write your own try/finally.

Sometimes easy makes up for clumsy.

> > In re-reading the examples, it occurred to me that the word "block"
> > already has meaning in the context of threading.Lock.acquire() which has
> > an optional blocking argument defaulting to 1.

> Yeah, Holger also pointed out that block is a common variable name... :-(

Someone mentioned "suite" as a suitable alternative. That word seems to encompass the same conceptual space without encroaching on existing variable and argument names. Also, "suite" reads as a noun.
In contrast, "block" has a verb form that too easily misconnects with the name of the block-iterator expression -- what comes to mind when you see block sender() or block next_message(). Performance-wise, I cringe at the thought of adding any weight at all to the for-loop semantics. The current version is super lightweight and clean. Adding anything to it will likely have a comparatively strong negative effect on timings. It's too early for that discussion, but keep it in mind. That's pretty much it for my first readings of the PEP. All-in-all it has come together nicely. Raymond ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
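Guido's observation above -- that the examples could equally be implemented with a class instead of a generator -- can be illustrated by driving a class-based block template by hand. The `__next__`/`__exit__` pairing below is the hypothetical protocol shape PEP 340 discusses; since no block statement exists, the calls a block statement would make are written out explicitly:

```python
import threading

class Synchronized:
    # A one-shot, class-based block template: __next__ hands control
    # to the (single) block body, __exit__ finalizes the resource.
    # No generator machinery involved.
    def __init__(self, lock):
        self.lock = lock

    def __next__(self):
        self.lock.acquire()        # enter: take the resource

    def __exit__(self, exc_type=None):
        self.lock.release()        # leave: always give it back

lock = threading.Lock()
tmpl = Synchronized(lock)
tmpl.__next__()                    # what "block tmpl:" would do on entry
held_inside = lock.locked()        # ... the block body would run here ...
tmpl.__exit__()                    # what the block would do on exit
```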
Re: [Python-Dev] PEP 340 -- concept clarification
> it's not infeasible to add a close() method to
> generators as a shortcut for this:
>
> def close(self):
>     try:
>         self.__exit__(StopIteration)
>     except StopIteration:
>         pass
>     else:
>         # __exit__() didn't raise StopIteration
>         raise RuntimeError("or some other exception")
>
> I'd like the block statement to be defined exclusively in terms of
> __exit__() though.
That sounds like a winner.
Raymond
Re: [Python-Dev] PEP 340 -- concept clarification
At 04:41 PM 5/3/05 -0700, Guido van Rossum wrote:
>Given all that, it's not infeasible to add a close() method to
>generators as a shortcut for this:
>
> def close(self):
>     try:
>         self.__exit__(StopIteration)
>     except StopIteration:
>         pass
>     else:
>         # __exit__() didn't raise StopIteration
>         raise RuntimeError("or some other exception")
>
>I'd like the block statement to be defined exclusively in terms of
>__exit__() though.
Sure. PEP 325 proposes a "CloseGenerator" exception in place of
"StopIteration", however, because:
"""
Issues: should StopIteration be reused for this purpose? Probably
not. We would like close to be a harmless operation for legacy
generators, which could contain code catching StopIteration to
deal with other generators/iterators.
"""
I don't know enough about the issue to offer either support or opposition
for this idea, though.
Re: [Python-Dev] PEP 340 -- concept clarification
[Guido] > > And come to think of it, using a generator to "drive" a block > > statement is also separable; with just the definition of the block > > statement from the PEP you could implement all the examples using a > > class (similar to example 6, which is easily turned into a template). [Raymond Hettinger] > I think that realization is important. It would be great to have a > section of the PEP that focuses on separability and matching features to > benefits. Start with above observation that the proposed examples can > be achieved with generators driving the block statement. Good idea. I'm kind of stuck for time (have used up most of my Python time for the next few weeks) -- if you or someone else could volunteer some text I'd appreciate it. > When the discussion hits comp.lang.python, a separability section will > help focus the conversation (there's a flaw/issue/dislike about feature > x; however, features y/z and related benefits do not depend on x). Right. The PEP started with me not worrying too much about motivation or use cases but instead focusing on precise specification of the mechanisms, since there was a lot of confusion over that. Now that's out of the way, motivation (you might call it "spin" :-) becomes more important. > Essentially, having generators as block drivers is the base proposal. > Everything else is an elaboration. Right. > Someone mentioned "suite" as a suitable alternative. That word seems to > encompass the same conceptual space without encroaching on existing > variable and argument names. Alas, the word "suite" is used extensively when describing Python's syntax. > Performance-wise, I cringe at the thought of adding any weight at all to > the for-loop semantics. The current version is super lightweight and > clean. Adding anything to it will likely have a comparatively strong > negative effect on timings. It's too early for that discussion, but > keep it in mind. 
A for-loop without a "continue EXPR" in it shouldn't need to change at all; the tp_iternext slot could be filled with either __next__ or next whichever is defined. -- --Guido van Rossum (home page: http://www.python.org/~guido/) ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 340 -- concept clarification
On 5/3/05, Phillip J. Eby <[EMAIL PROTECTED]> wrote:
> At 04:41 PM 5/3/05 -0700, Guido van Rossum wrote:
> >Given all that, it's not infeasible to add a close() method to
> >generators as a shortcut for this:
> >
> > def close(self):
> >     try:
> >         self.__exit__(StopIteration)
> >     except StopIteration:
> >         pass
> >     else:
> >         # __exit__() didn't raise StopIteration
> >         raise RuntimeError("or some other exception")
> >
> >I'd like the block statement to be defined exclusively in terms of
> >__exit__() though.
(So do you want this feature now or not? Earlier you said it was no big deal.)
> Sure. PEP 325 proposes a "CloseGenerator" exception in place of
> "StopIteration", however, because:
>
> """
> Issues: should StopIteration be reused for this purpose? Probably
> not. We would like close to be a harmless operation for legacy
> generators, which could contain code catching StopIteration to
> deal with other generators/iterators.
> """
>
> I don't know enough about the issue to offer either support or opposition
> for this idea, though.
That would be an issue for the generator finalization proposed by the
PEP as well.
But I kind of doubt that it's an issue; you'd have to have a
try/except catching StopIteration around a yield statement that
resumes the generator before this becomes an issue, and that sounds
extremely improbable. If at all possible I'd rather not have to define
a new exception for this purpose.
--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] PEP 340 -- concept clarification
Guido van Rossum wrote:
> I'd like the block statement to be defined exclusively in terms of
> __exit__() though.

This does actually suggest something to me (note - just a thought - no real idea if it's got any merit).

Are there any use cases proposed for the block-statement (excluding the for-loop) that do *not* involve resource cleanup (i.e. need an __exit__)? This could be the distinguishing feature between for-loops and block-statements:

1. If an iterator declares __exit__, it cannot be used in a for-loop.
   For-loops do not guarantee resource cleanup.

2. If an iterator does not declare __exit__, it cannot be used in a block-statement.
   Block-statements guarantee resource cleanup.

This gives separation of API (and thus purpose) whilst maintaining the simplicity of the concept.

Unfortunately, generators then become a pain :( We would need additional syntax to declare that a generator was a block generator. OTOH, this may not be such a problem. Any generator that contains a finally: around a yield automatically gets an __exit__, and any that doesn't, doesn't. Although that feels *way* too magical to me (esp. in light of my example below, which *doesn't* use finally). I'd prefer a separate keyword for block generators. In that case, having finally: around a yield would be a syntax error in a "normal" generator.

::

    resource locking(lock):
        lock.acquire()
        try:
            yield
        finally:
            lock.release()

    block locking(myLock):
        # Code here executes with myLock held. The lock is
        # guaranteed to be released when the block is left (even
        # if via return or by an uncaught exception).
To use a (modified) example from another email::

    class TestCase:
        resource assertRaises(self, excClass):
            try:
                yield
            except excClass:
                return
            else:
                if hasattr(excClass, '__name__'):
                    excName = excClass.__name__
                else:
                    excName = str(excClass)
                raise self.failureException, "%s is not raised" % excName

    block self.assertRaises(TypeError):
        raise TypeError

Note that this *does* require cleanup, but without using a finally: clause - the except: and else: are the cleanup code.

Tim Delaney
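Historically, this resource/block pairing is more or less what later shipped as PEP 343: Tim's locking example maps directly onto `contextlib.contextmanager` and the `with` statement in today's Python:

```python
import threading
from contextlib import contextmanager

@contextmanager
def locking(lock):
    # The decorator plays the role of the proposed "resource" keyword;
    # the "with" statement plays the role of "block".
    lock.acquire()
    try:
        yield
    finally:
        lock.release()

my_lock = threading.Lock()
with locking(my_lock):
    held = my_lock.locked()   # True while the block body runs
# The lock is released on exit, even via return or an exception.
```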
Re: [Python-Dev] PEP 340 -- concept clarification
> > I think that realization is important. It would be great to have a > > section of the PEP that focuses on separability and matching features to > > benefits. Start with above observation that the proposed examples can > > be achieved with generators driving the block statement. > > Good idea. I'm kind of stuck for time (have used up most of my Python > time for the next few weeks) -- if you or someone else could volunteer > some text I'd appreciate it. I'll take a crack at it in the morning (we all seem to be on borrowed time this week). > > When the discussion hits comp.lang.python, a separability section will > > help focus the conversation (there's a flaw/issue/dislike about feature > > x; however, features y/z and related benefits do not depend on x). > > Right. The PEP started with me not worrying too much about motivation > or use cases but instead focusing on precise specification of the > mechanisms, since there was a lot of confusion over that. Now that's > out of the way, motivation (you might call it "spin" :-) becomes more > important. Perhaps the cover announcement should impart the initial spin as a request for the community to create, explore, and learn from use cases. That will help make the discussion more constructive, less abstract, and more grounded in reality (wishful thinking). That probably beats, "Here's 3500 words of proposal; do you like it?". Raymond ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 340 -- concept clarification
At 05:17 PM 5/3/05 -0700, Guido van Rossum wrote: >(So do you want this feature now or not? Earlier you said it was no big deal.) It *isn't* a big deal; but it'd still be nice, and I'd happily volunteer to do the actual implementation of the 'close()' method myself, because it's about the same amount of work as updating PEP 333 and sorting out any political issues that might arise therefrom. :) >But I kind of doubt that it's an issue; you'd have to have a >try/except catching StopIteration around a yield statement that >resumes the generator before this becomes an issue, and that sounds >extremely improbable. But it does exist, alas; see the 'itergroup()' and 'xmap()' functions of this cookbook recipe: http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/66448/ Or more pointedly, the 'roundrobin()' example in the Python 2.4 documentation: http://www.python.org/doc/lib/deque-recipes.html And there are other examples as well: http://www.faqts.com/knowledge_base/view.phtml/aid/13516 http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/141934 ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 340 -- concept clarification
At 08:47 PM 5/3/05 -0400, Phillip J. Eby wrote: > http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/141934 Oops; that one's not really a valid example; the except StopIteration just has a harmless "pass", and it's not in a loop. ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
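The hazard these examples point at is easy to reproduce in today's Python: a legacy generator that catches StopIteration around a yield silently survives a close() defined in terms of throwing StopIteration into it. A minimal sketch using `generator.throw()`:

```python
def tolerant():
    while True:
        try:
            yield "value"
        except StopIteration:
            # The would-be "close me" signal is swallowed and the
            # generator just keeps running -- the legacy pattern
            # PEP 325's CloseGenerator exception is meant to avoid.
            continue

g = tolerant()
first = next(g)
resumed = g.throw(StopIteration)  # caught inside; a fresh value comes back
```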
Re: [Python-Dev] PEP 340 -- concept clarification
> >(So do you want this feature now or not? Earlier you said it was no big > deal.) > > It *isn't* a big deal; but it'd still be nice, and I'd happily volunteer > to > do the actual implementation of the 'close()' method myself, because it's > about the same amount of work as updating PEP 333 and sorting out any > political issues that might arise therefrom. :) Can I recommend tabling this one for the time being. My sense is that it can be accepted independently of PEP 340 but that it should wait until afterwards because the obvious right-thing-to-do will be influenced by what happens with 340. Everyone's bandwidth is being maxed-out at this stage. So it is somewhat helpful to keep focused on the core proposal of generator driven block/suite thingies. Raymond ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Need to hook Py_FatalError
On Tue, 2005-05-03 at 13:39 -0700, Josiah Carlson wrote: > If I'm wrong, I'd like to hear it, but I'm still waiting for your patch > on sourceforge. Well, if he lost/loses interest for whatever reason, I'd be willing to provide. Although, if m.u.k. is going to write it, please be sure to include a CPP macro/define, so that embedders could recognize the feature without having to run explicit checks or do version based fingerprinting. (I'd be interested to follow the patch if you(muk) put it up!) Hrm, although, I don't think it would be wise to allow extension modules to set this. IMO, there should be some attempt to protect it; ie, once it's initialized, don't allow reinitialization, as if the embedder is handling it, it should be handled through the duration of the process. So, a static function pointer in pythonrun.c initialized to NULL, a protective setter that will only allow setting if the pointer is NULL, and Py_FatalError calling the pointer if pointer != Py_FatalError. Should [Py_FatalError] fall through if the hook didn't terminate the process to provide some level of warranty that the process will indeed die? Sound good? -- Regards, James William Pye ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 340 -- concept clarification
Delaney, Timothy C (Timothy) wrote:
> Guido van Rossum wrote:
>
>> I'd like the block statement to be defined exclusively in terms of
>> __exit__() though.
>
> 1. If an iterator declares __exit__, it cannot be used in a for-loop.
>    For-loops do not guarantee resource cleanup.
>
> 2. If an iterator does not declare __exit__, it cannot be used in a block-statement.
>    Block-statements guarantee resource cleanup.

Now some thoughts have solidified in my mind ... I'd like to define some terminology that may be useful.

resource protocol:
    __next__
    __exit__
    Note: __iter__ is explicitly *not* required.

resource:
    An object that conforms to the resource protocol.

resource generator:
    A generator function that produces a resource.

resource usage statement/suite:
    A suite that uses a resource.

With this conceptual framework, I think the following makes sense:

- Keyword 'resource' for defining a resource generator.
- Keyword 'use' for using a resource.

e.g. ::

    resource locker(lock):
        lock.acquire()
        try:
            yield
        finally:
            lock.release()

    use locker(lock):
        # do stuff

Tim Delaney
Re: [Python-Dev] Py_UNICODE madness
On May 3, 2005, at 6:44 PM, Guido van Rossum wrote: > I think that documentation is wrong; AFAIK Py_UNICODE has always been > allowed to be either 16 or 32 bits, and the source code goes through > great lengths to make sure that you get a link error if you try to > combine extensions built with different assumptions about its size. That makes PyUnicode_FromUnicode() a lot less useful. Well, really, not useful at all. You might suggest that PyUnicode_FromWideChar is more useful, but that's only true on platforms that support wchar_t. Is there no universally supported way of moving buffers of unicode data (as common data types, like unsigned short, etc.) into Python from C? -- Nick ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Py_UNICODE madness
I really don't know. Effbot, MvL and/or MAL should know. On 5/3/05, Nicholas Bastin <[EMAIL PROTECTED]> wrote: > > On May 3, 2005, at 6:44 PM, Guido van Rossum wrote: > > > I think that documentation is wrong; AFAIK Py_UNICODE has always been > > allowed to be either 16 or 32 bits, and the source code goes through > > great lengths to make sure that you get a link error if you try to > > combine extensions built with different assumptions about its size. > > That makes PyUnicode_FromUnicode() a lot less useful. Well, really, > not useful at all. > > You might suggest that PyUnicode_FromWideChar is more useful, but > that's only true on platforms that support wchar_t. > > Is there no universally supported way of moving buffers of unicode data > (as common data types, like unsigned short, etc.) into Python from C? > > -- > Nick > > -- --Guido van Rossum (home page: http://www.python.org/~guido/) ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] 2 words keyword for block
Gheorghe Milas wrote: > in template thread_safe(lock): > in template redirected_stdout(stream): > in template use_and_close_file(path) as file: > in template as_transaction(): > in template auto_retry(times=3, failas=IOError): -1. This is unpythonically verbose. If I wanted to get lots of finger exercise typing redundant keywords, I'd program in COBOL. :-) -- Greg Ewing, Computer Science Dept, +--+ University of Canterbury, | A citizen of NewZealandCorp, a | Christchurch, New Zealand | wholly-owned subsidiary of USA Inc. | [EMAIL PROTECTED] +--+ ___ Python-Dev mailing list [email protected] http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
