Re: [Python-Dev] PythonCore\CurrentVersion
> What happened to the CurrentVersion registry entry documented at
>
> http://www.python.org/windows/python/registry.html
>
> AFAICT, even the python15.wse file did not fill a value in this
> entry (perhaps I'm misinterpreting the wse file, though).
>
> So was this ever used? Why is it documented, and who documented it
> (unfortunately, registry.html is not in cvs/subversion, either)?

I believe I documented it many moons ago. I don't think CurrentVersion was
ever implemented (or possibly it existed for a very short time before being
removed).

The "registered modules" concept was misguided and AFAIK is not used by
anyone - IMO it should be deprecated (if not just removed!).

Further, I believe the PYTHONPATH documentation in that file is, as the docs
themselves note, out of date; the comments in getpathp.c are correct.

Cheers,
Mark
[Python-Dev] New PEP 342 suggestion: result() and allow "return with arguments" in generators (was Re: PEP 342 suggestion: start(), __call__() and unwind_call() methods)
Nick Coghlan wrote:
> Although, if StopIteration.result was a read-only property with the above
> definition, wouldn't that give us the benefit of "one obvious way" to
> return a value from a coroutine without imposing any runtime cost on
> normal use of StopIteration to finish an iterator?

Sometimes I miss the obvious. There's a *much*, *much* better place to store
the return value of a generator than on the StopIteration exception that it
raises when it finishes. Just save the return value in the *generator*.

And then provide a method on generators that is the functional equivalent of:

    def result():
        # Finish the generator if it isn't finished already
        for step in self:
            pass
        return self._result # Return the result saved when the block finished

It doesn't matter that a for loop swallows the StopIteration exception any
more, because the return value is retrieved directly from the generator.

I also like that this interface could still be used even if the work of
getting the result is actually farmed off to a separate thread or process
behind the scenes.

Cheers,
Nick.

P.S. Here's what a basic trampoline scheduler without builtin asynchronous
call support would look like if coroutines could return values directly.
The bits that it cleans up are marked "NEW":

import collections
import sys
import types

class Trampoline:
    """Manage communications between coroutines"""

    running = False

    def __init__(self):
        self.queue = collections.deque()

    def add(self, coroutine):
        """Request that a coroutine be executed"""
        self.schedule(coroutine)

    def run(self):
        result = None
        self.running = True
        try:
            while self.running and self.queue:
                func = self.queue.popleft()
                result = func()
            return result
        finally:
            self.running = False

    def stop(self):
        self.running = False

    def schedule(self, coroutine, stack=(), call_result=None, *exc):
        # Define the new pseudothread
        def pseudothread():
            try:
                if exc:
                    callee = coroutine.throw(call_result, *exc)
                else:
                    callee = coroutine.send(call_result)
            except StopIteration:   # NEW: no need to name exception
                # Coroutine finished cleanly
                if stack:
                    # Send the result to the caller
                    caller = stack[0]
                    prev_stack = stack[1]
                    # NEW: get result directly from callee
                    self.schedule(
                        caller, prev_stack, callee.result()
                    )
            except:
                # Coroutine finished with an exception
                if stack:
                    # send the error back to the caller
                    caller = stack[0]
                    prev_stack = stack[1]
                    self.schedule(
                        caller, prev_stack, *sys.exc_info()
                    )
                else:
                    # Nothing left in this pseudothread to
                    # handle it, let it propagate to the
                    # run loop
                    raise
            else:
                # Coroutine isn't finished yet
                if callee is None:
                    # Reschedule the current coroutine
                    self.schedule(coroutine, stack)
                elif isinstance(callee, types.GeneratorType):
                    # Make a call to another coroutine
                    self.schedule(callee, (coroutine, stack))
                elif callable(callee):
                    # Make a blocking call in a separate thread
                    self.schedule(
                        threaded(callee), (coroutine, stack)
                    )
                else:
                    # Raise a TypeError in the current coroutine
                    self.schedule(coroutine, stack, TypeError,
                                  "Illegal argument to yield")
        # Add the new pseudothread to the execution queue
        self.queue.append(pseudothread)

P.P.S. Here's the simple coroutine that threads out a c
Re: [Python-Dev] Proposed changes to PEP 343
Nick Coghlan wrote:
> Anders J. Munch wrote:
>
>> Note that __with__ and __enter__ could be combined into one with no
>> loss of functionality:
>>
>> abc,VAR = (EXPR).__with__()
>
> They can't be combined, because they're invoked on different objects.

Sure they can. The combined method first does what __with__ would have done
to create abc, and then does whatever abc.__enter__ would have done. Since
the type of 'abc' is always known to the author of __with__, this is
trivial.

Strictly speaking there's no guarantee that the type of 'abc' is known to
the author of __with__, but I can't imagine an example where that would not
be the case.

> It would be like trying to combine __iter__() and next() into the same
> method for iterators. . .

The with-statement needs two pieces of information from the expression:
which object to bind to the user's variable (VAR) and which object takes
care of block-exit cleanup (abc). A combined method would give these two
equal standing rather than deriving one from the other. Nothing ugly about
that.

- Anders
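A hypothetical sketch of the single-method protocol Anders describes: the
__with__ name and its two-value return are his proposal, not the API PEP 343
adopted, and the toy Resource class is invented purely for illustration.

    class Resource(object):
        def __init__(self):
            self.open = False

        def __with__(self):
            self.open = True
            # abc handles block-exit cleanup; VAR is what 'as' would bind
            return _Cleanup(self), self

    class _Cleanup(object):
        def __init__(self, resource):
            self.resource = resource
        def __exit__(self, exc_type, exc_value, tb):
            self.resource.open = False

    # Roughly what 'with Resource() as r:' would expand to under this proposal:
    abc, VAR = Resource().__with__()
    try:
        assert VAR.open
    finally:
        abc.__exit__(None, None, None)
    assert not VAR.open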
Re: [Python-Dev] Proposed changes to PEP 343
On 10/9/05, Anders J. Munch <[EMAIL PROTECTED]> wrote:
> Nick Coghlan wrote:
> > Anders J. Munch wrote:
> >
> >> Note that __with__ and __enter__ could be combined into one with no
> >> loss of functionality:
> >>
> >> abc,VAR = (EXPR).__with__()
> >
> > They can't be combined, because they're invoked on different objects.
>
> Sure they can. The combined method first does what __with__ would
> have done to create abc, and then does whatever abc.__enter__ would
> have done. Since the type of 'abc' is always known to the author of
> __with__, this is trivial.

I'm sure it can be done, but I find this ugly API design. While I'm not keen
on complicating the API, the decimal context example has convinced me that
it's necessary. The separation into __with__, which asks EXPR for a context
manager, and __enter__ / __exit__, which handle try/finally, feels right. An
API returning a tuple is asking for bugs.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] New PEP 342 suggestion: result() and allow "return with arguments" in generators (was Re: PEP 342 suggestion: start(), __call__() and unwind_call() methods)
On 10/9/05, Nick Coghlan <[EMAIL PROTECTED]> wrote:
> Sometimes I miss the obvious. There's a *much*, *much* better place to store
> the return value of a generator than on the StopIteration exception that it
> raises when it finishes. Just save the return value in the *generator*.
>
> And then provide a method on generators that is the functional equivalent of:
>
> def result():
> # Finish the generator if it isn't finished already
> for step in self:
> pass
> return self._result # Return the result saved when the block finished
>
> It doesn't matter that a for loop swallows the StopIteration exception any
> more, because the return value is retrieved directly from the generator.
Actually, I don't like this at all. It harks back to earlier proposals
where state was stored on the generator (e.g. PEP 288).
> I also like that this interface could still be used even if the work of
> getting the result is actually farmed off to a separate thread or process
> behind the scenes.
That seems an odd use case for generators, better addressed by
creating an explicit helper object when the need exists. I bet that
object will need to exist anyway to hold other information related to
the exchange of information between threads (like a lock or a Queue).
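For illustration only, a minimal sketch of such an explicit helper object;
the names and the Queue-based hand-off are invented for this example and are
not part of any proposal in the thread:

    import threading
    import queue   # spelled 'Queue' in the Python of this thread

    class AsyncResult(object):
        """Helper object that carries a result back from a worker thread."""
        def __init__(self):
            self._q = queue.Queue(maxsize=1)

        def set(self, value):
            self._q.put(value)      # called from the worker thread

        def get(self):
            return self._q.get()    # blocks the caller until set() runs

    def run_in_thread(func, *args):
        result = AsyncResult()
        def worker():
            result.set(func(*args))
        threading.Thread(target=worker).start()
        return result

    # Kick off the work, keep going, and block only when the value is needed
    r = run_in_thread(sum, range(10))
    print(r.get())                  # 45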
Looking at your example, I have to say that I find the trampoline
example from PEP 342 really hard to understand. It took me several
days to get it after Phillip first put it in the PEP, and that was
after having reconstructed the same functionality independently. (I
have plans to replace or augment it with a different set of examples,
but haven't gotten the time. Old story...) I don't think that
something like that ought to be motivating generator extensions. I
also think that using a thread for async I/O is the wrong approach --
if you wanted to use threads you should be using threads and you
wouldn't be dealing with generators. There's a solution that uses
select() which can handle as many sockets as you want without threads
and without the clumsy polling ("is it ready yet? is it ready yet? is
it ready yet?").
I urge you to leave well enough alone. There's room for extensions
after people have built real systems with the raw material provided by
PEP 342 and 343.
--
--Guido van Rossum (home page: http://www.python.org/~guido/)
[Python-Dev] defaultproperty (was: Re: RFC: readproperty)
Based on the discussion, I think I'd go with defaultproperty.

Questions:

- Should this be in builtins, alongside property, or in a library module?
  (Oleg suggested propertytools.)

- Do we need a short PEP?

Jim

Jim Fulton wrote:
> Guido van Rossum wrote:
>
>> On 9/28/05, Jim Fulton <[EMAIL PROTECTED]> wrote:
>>
> ...
>
>> I think we need to be real careful with choosing a name -- in Jim's
>> example, *anyone* could assign to Spam().eggs to override the value.
>> The name "readproperty" is too close to "readonlyproperty",
>
> In fact, property creates read-only properties for new-style classes.
> (I hadn't realized, until reading this thread, that for classic
> classes, you could still set the attribute.)
>
>> but read-only it ain't! "Lazy" also doesn't really describe what's
>> going on.
>
> I agree.
>
>> I believe some folks use a concept of "memo functions" which resemble
>> this proposal except the notation is different: IIRC a memo function
>> is always invoked as a function, but stores its result in a private
>> instance variable, which it returns upon subsequent calls. This is a
>> common pattern. Jim's proposal differs because the access looks like
>> an attribute, not a method call. Still, perhaps memoproperty would be
>> a possible name.
>>
>> Another way to look at the naming problem is to recognize that the
>> provided function really computes a default value if the attribute
>> isn't already set. So perhaps defaultproperty?
>
> Works for me.
>
> Oleg Broytmann wrote:
>> On Wed, Sep 28, 2005 at 10:16:12AM -0400, Jim Fulton wrote:
>>
>>> class readproperty(object):
>>
>> [skip]
>>
>>> I do this often enough
>>
>> I use it since about 2000 often enough under the name CachedAttribute:
>>
>> http://cvs.sourceforge.net/viewcvs.py/ppa/qps/qUtils.py
>
> Steven Bethard wrote:
>> Jim Fulton wrote:
>> ...
>> I've also needed behavior like this a few times, but I use a variant
>> of Scott David Daniel's recipe[1]:
>>
>> class _LazyAttribute(object):
>
> Yup, the Zope 3 sources have something very similar:
>
> http://svn.zope.org/Zope3/trunk/src/zope/cachedescriptors/property.py?view=markup
>
> I actually think this does too much. All it saves me, compared to what
> I proposed, is one assignment. I'd rather make that assignment explicit.
>
> Anyway, all I wanted with readproperty was a property that implemented
> only __get__, as opposed to property, which implements __get__, __set__,
> and __delete__.
>
> I'd be happy to call it readproperty or getproperty or defaultproperty
> or whatever. :)
>
> I'd prefer that its semantics stay fairly simple though.
>
> Jim

--
Jim Fulton           mailto:[EMAIL PROTECTED]     Python Powered!
CTO                  (540) 361-1714               http://www.python.org
Zope Corporation     http://www.zope.com          http://www.zope.org
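For reference, a minimal sketch of the kind of descriptor under discussion.
It implements only __get__, so a later instance assignment simply overrides
the computed default; the class body is illustrative only and no such class
was added to the stdlib:

    class defaultproperty(object):
        def __init__(self, func):
            self.func = func
            self.__doc__ = func.__doc__

        def __get__(self, obj, objtype=None):
            if obj is None:
                return self
            return self.func(obj)

    class Spam(object):
        @defaultproperty
        def eggs(self):
            return 42          # computed default

    spam = Spam()
    print(spam.eggs)           # 42, computed by the function
    spam.eggs = 3              # allowed: no __set__, so the instance dict wins
    print(spam.eggs)           # 3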
Re: [Python-Dev] async IO and helper threads
On Sunday, October 09, 2005 at 07:46 -0700, Guido van Rossum wrote:
> I also think that using a thread for async I/O is the wrong approach --
> if you wanted to use threads you should be using threads and you
> wouldn't be dealing with generators. There's a solution that uses
> select() which can handle as many sockets as you want without threads
> and without the clumsy polling

select() works with sockets, but nothing else if you want to stay
cross-platform, so async file IO and other things remain open questions.

By the way, you don't need clumsy polling to wait for helper threads ;)
You can just use a condition variable from the threading package (or
something else with the same semantics).

BTW, I'm not arguing at all for the extension proposal. Integrating async
stuff into generators does not need an API extension IMO. I'm already doing
it in my scheduler. An example which just waits for an external command to
finish and periodically spins a character in the meantime:

http://svn.berlios.de/viewcvs/tasklets/trunk/examples/popen1.py?view=markup

The scheduler code is here:

http://svn.berlios.de/viewcvs/tasklets/trunk/softlets/core/switcher.py?view=markup

Regards

Antoine.
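To illustrate the no-polling point, a small sketch that waits for a helper
thread with a condition variable from the threading module; the worker
function and its result are made up for the example:

    import threading

    result_box = {}
    cond = threading.Condition()

    def helper():
        value = sum(range(1000))        # stand-in for slow blocking work
        with cond:
            result_box['value'] = value
            cond.notify()               # wake the waiter; no polling loop

    threading.Thread(target=helper).start()

    with cond:
        while 'value' not in result_box:
            cond.wait()                 # sleeps until notified
    print(result_box['value'])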
Re: [Python-Dev] Removing the block stack (was Re: PEP 343 and __with__)
Phillip J. Eby wrote:

> Clearly, the cost of function calls in Python lies somewhere else, and
> I'd probably look next at parameter tuple allocation,

For simple calls where there aren't any *args or other such complications,
it seems like it should be possible to just copy the args from the calling
frame straight into the called one.

Or is this already done these days?

--
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,          | A citizen of NewZealandCorp, a       |
Christchurch, New Zealand          | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]                  +--------------------------------------+
Re: [Python-Dev] defaultproperty (was: Re: RFC: readproperty)
On 10/9/05, Jim Fulton <[EMAIL PROTECTED]> wrote:
> Based on the discussion, I think I'd go with defaultproperty.

Great.

> Questions:
>
> - Should this be in builtins, alongside property, or in
>   a library module? (Oleg suggested propertytools.)
>
> - Do we need a short PEP?

I think so. From the responses I'd say there's at most lukewarm interest
(including from me). You might also want to drop it and just add it to your
personal (or Zope's) library.

--
--Guido van Rossum (home page: http://www.python.org/~guido/)
Re: [Python-Dev] Removing the block stack (was Re: PEP 343 and __with__)
At 01:33 PM 10/10/2005 +1300, Greg Ewing wrote:
> Phillip J. Eby wrote:
>
> > Clearly, the cost of function calls in Python lies somewhere else, and
> > I'd probably look next at parameter tuple allocation,
>
> For simple calls where there aren't any *args or other
> such complications, it seems like it should be possible
> to just copy the args from the calling frame straight
> into the called one.
>
> Or is this already done these days?

It's already done, if the number of arguments matches, the code flags are
just so, etc.
Re: [Python-Dev] New PEP 342 suggestion: result() and allow "return with arguments" in generators (was Re: PEP 342 suggestion: start(), __call__() and unwind_call() methods)
Nick Coghlan wrote:

> Sometimes I miss the obvious. There's a *much*, *much* better place to
> store the return value of a generator than on the StopIteration exception
> that it raises when it finishes. Just save the return value in the
> *generator*.

I'm not convinced that this is better, because it would make value-returning
something specific to generators. On the other hand, raising
StopIteration(value) is something that any iterator can easily do, whether
it's implemented as a generator, a Python class, a C type, or whatever.

Besides, it doesn't smell right to me -- sort of like returning a value from
a function by storing it in a global rather than using a return statement.

--
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,          | A citizen of NewZealandCorp, a       |
Christchurch, New Zealand          | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]                  +--------------------------------------+
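A small illustration of that point: a hand-written iterator class (not a
generator) attaching a final value to the StopIteration it raises. The class
and the value are invented for the example.

    class CountDown(object):
        def __init__(self, n):
            self.n = n
            self.seen = 0

        def __iter__(self):
            return self

        def __next__(self):          # Python 2 spelled this method 'next'
            if self.n == 0:
                raise StopIteration(self.seen)   # the "return value"
            self.seen += 1
            self.n -= 1
            return self.n

    it = CountDown(3)
    try:
        while True:
            next(it)
    except StopIteration as exc:
        print(exc.args[0])           # 3 -- retrieved from the exception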
Re: [Python-Dev] PEP 342 suggestion: start(), __call__() and unwind_call() methods
Guido van Rossum wrote:

> Plus, Piet also remarked that the value is silently ignored
> when the generator is used in a for-loop. ... I'd worry that accepting
> "return X" would increase the occurrence of bugs caused by someone
> habitually writing "return X" where they meant "yield X".

Then have for-loops raise an exception if they get a StopIteration with
something other than None as an argument.

> I'd like to keep StopIteration really lightweight so it doesn't slow
> down its use in other places.

You could leave StopIteration itself alone altogether and have a subclass
StopIterationWithValue for returning things. This would make the for-loop
situation even safer, since then you could distinguish between falling off
the end of a generator and executing 'return None' inside it.

--
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,          | A citizen of NewZealandCorp, a       |
Christchurch, New Zealand          | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]                  +--------------------------------------+
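A hedged sketch of the first suggestion: a loop helper that complains when
the StopIteration it swallows carries a value. The checked_for name is
invented and purely illustrative; nothing like it exists in Python.

    def checked_for(iterable, body):
        it = iter(iterable)
        while True:
            try:
                item = next(it)
            except StopIteration as exc:
                if exc.args and exc.args[0] is not None:
                    raise RuntimeError(
                        "loop would silently discard return value %r"
                        % (exc.args[0],))
                return
            body(item)

    checked_for([1, 2, 3], print)   # fine: a plain StopIteration has no value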
Re: [Python-Dev] Extending tuple unpacking
Guido van Rossum wrote:

> I personally think this is adequately handled by writing:
>
> (first, second), rest = something[:2], something[2:]

That's less than satisfying because it violates DRY three times (once for
mentioning 'something' twice, once for mentioning the index twice, and once
for needing to make sure the index agrees with the number of items on the
LHS).

> Argument lists are not tuples [*] and features of argument lists
> should not be confused with features of tuple unpackings.

I'm aware of the differences, but I still see a strong similarity where this
particular feature is concerned. The pattern of thinking is the same: "I
want to deal with the first n of these things individually, and the rest
collectively."

--
Greg Ewing, Computer Science Dept, +--------------------------------------+
University of Canterbury,          | A citizen of NewZealandCorp, a       |
Christchurch, New Zealand          | wholly-owned subsidiary of USA Inc.  |
[EMAIL PROTECTED]                  +--------------------------------------+
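For comparison, the workaround can at least be factored into a helper so the
sequence and the index are each written only once; split_at is an invented
name, not part of any proposal in this thread.

    def split_at(seq, n):
        """Return (seq[:n], seq[n:]) so the index is written once."""
        return seq[:n], seq[n:]

    something = ['a', 'b', 'c', 'd', 'e']
    (first, second), rest = split_at(something, 2)
    print(first, second, rest)   # a b ['c', 'd', 'e']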
Re: [Python-Dev] Extending tuple unpacking
On Sunday 09 October 2005 22:44, Greg Ewing wrote:
> I'm aware of the differences, but I still see a strong
> similarity where this particular feature is concerned.
> The pattern of thinking is the same: "I want to deal
> with the first n of these things individually, and the
> rest collectively."

Well stated. I'm in complete agreement on this matter.

-Fred

--
Fred L. Drake, Jr.
Re: [Python-Dev] defaultproperty
Jim Fulton wrote:
> Based on the discussion, I think I'd go with defaultproperty.
>
> Questions:
>
> - Should this be in builtins, alongside property, or in
>   a library module? (Oleg suggested propertytools.)
>
> - Do we need a short PEP?

The much-discussed never-created decorators module, perhaps?

Cheers,
Nick.

--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
http://boredomandlaziness.blogspot.com
Re: [Python-Dev] Removing the block stack (was Re: PEP 343 and __with__)
On 10/6/05, Phillip J. Eby <[EMAIL PROTECTED]> wrote:
> At 10:09 PM 10/5/2005 -0700, Neal Norwitz wrote:
> >The general idea is to allocate the stack in one big hunk and just
> >walk up/down it as functions are called/returned. This only means
> >incrementing or decrementing pointers. This should allow us to avoid
> >a bunch of copying and tuple creation/destruction. Frames would
> >hopefully be the same size which would help. Note that even though
> >there is a free list for frames, there could still be
> >PyObject_GC_Resize()s often (or unused memory). With my idea,
> >hopefully there would be better memory locality, which could speed
> >things up.
>
> Yeah, unfortunately for your idea, generators would have to copy off bits
> of the stack and then copy them back in, making generators slower. If it
> weren't for that part, the idea would probably be a good one, as
> arguments, locals, cells, and the block and value stacks could all be
> handled that way, with the compiler treating all operations as
> base-pointer offsets, thereby eliminating lots of more-complex pointer
> management in ceval.c and frameobject.c.

If we had these separate stacks for each thread, would it be possible to
also create a stack for generator calls? The current call operations could
check whether the function being called is a generator (if generators don't
already carry a flag bit for this, could one be added to speed the check
up?). This generator-specific stack would be used for the generator's frame
and any calls it makes on each iteration.

This may pose the threat of a bottleneck, allocating a new stack in the heap
for every generator call, but generators are generally iterated more often
than they are created, and the stacks could be pooled, of course.

I don't know as much as I'd like about the CPython internals, so I'm just
throwing this out there for commenting by those in the know.
[Python-Dev] Fwd: defaultproperty
Sorry, Nick. GMail, for some reason, doesn't follow the reply-to properly
for python-dev. Forwarding to the list now...

On 10/9/05, Nick Coghlan <[EMAIL PROTECTED]> wrote:
> Jim Fulton wrote:
> > Based on the discussion, I think I'd go with defaultproperty.
> >
> > Questions:
> >
> > - Should this be in builtins, alongside property, or in
> >   a library module? (Oleg suggested propertytools.)
> >
> > - Do we need a short PEP?
>
> The much-discussed never-created decorators module, perhaps?
>
> Cheers,
> Nick.

Never created for a reason? Lumping things together because they have
similar usage semantics but unrelated purposes might be something to avoid,
and maybe that's why it hasn't happened yet for decorators. If ever there
was a makethreadsafe decorator, it should go in the thread module, etc. I
mean, come on, it's like making a module just to store a bunch of unrelated
types, lumped together only because they're types. Who wants that?
