Re: [Python-Dev] Minor change to Enum -- should it go into 3.5.2?
On 9 May 2016 at 12:39, Ethan Furman wrote:
> On 05/08/2016 07:15 PM, Nick Coghlan wrote:
>> Needing to use a PyPI alternative to a stdlib module for increased
>> cross-version consistency is a pretty common experience these days, so
>> I think that's OK - end users can choose for themselves between the
>> stability of the stdlib version and the reduced update latency of the
>> PyPI version.
>
> Are you saying I shouldn't bother updating the 3.6 Enum to ignore
> _order_?

I don't have an opinion on the change itself (although keeping PyPI and
3.6-to-be aligned seems reasonable), just that it's OK to suggest folks
use a backport module even on earlier 3.x versions if they want the
behaviour of the latest (or even the next upcoming) stdlib release,
rather than the baseline behaviour of their current runtime.

Cheers,
Nick.

--
Nick Coghlan | [email protected] | Brisbane, Australia
___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
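[For readers unfamiliar with the attribute under discussion: a sketch of the compatibility role of `_order_`, with an illustrative `Priority` enum. On Python 2, the enum34/aenum backports used `_order_` to pin member order, since class bodies there were unordered; on Python 3 definition order is preserved anyway, so the stdlib can check it (or ignore it, as proposed above).]

```python
from enum import Enum

class Priority(Enum):
    # Redundant on Python 3, but needed by the Python 2 backports; the
    # stdlib compares it against the actual definition order.
    _order_ = 'LOW MEDIUM HIGH'
    LOW = 1
    MEDIUM = 2
    HIGH = 3

# Iteration follows definition order, matching _order_:
print([member.name for member in Priority])
```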
Re: [Python-Dev] Return type of alternative constructors
On Sun, May 8, 2016 at 7:52 PM, Nick Coghlan wrote:
> On 9 May 2016 at 08:50, Guido van Rossum wrote:
> > On Sun, May 8, 2016 at 4:49 AM, Nick Coghlan wrote:
> >> P.S. The potential complexity of that is one of the reasons the design
> >> philosophy of "prefer composition to inheritance" has emerged -
> >> subclassing is a powerful tool, but it does mean you often end up
> >> needing to care about more interactions between the subclass and the
> >> base class than you really wanted to.
> >
> > Indeed!
> >
> > We could also consider this a general weakness of the "alternative
> > constructors are class methods" pattern. If instead these alternative
> > constructors were folded into the main constructor (e.g. via special
> > keyword args) it would be altogether clearer what a subclass should do.
>
> Unfortunately, even that approach gets tricky when the inheritance
> relationship crosses the boundary between components with independent
> release cycles.
>
> In my experience, this timeline is the main one that causes the pain:
>
> * Base class is released in Component A (e.g. CPython)
> * Subclass is released in Component B (e.g. a PyPI module)
> * Component A releases a new base class construction feature
>
> Question: does the new construction feature work with the existing
> subclass in Component B if you combine it with the new version of
> Component A?
>
> When alternate constructors can be implemented as class methods that
> work by creating a default instance and using existing public API
> methods to mutate it, then the answer to that question is "yes", since
> the default constructor hasn't changed, and the new convenience
> constructor isn't relying on any other new features.
>
> The answer is also "yes" for existing subclasses that only add new
> behaviour without adding any new state, and hence just use the base
> class __new__ and __init__ without overriding either of them.
> It's when the existing subclass overrides __new__ or __init__ and
> one or both of the following is true that things can get tricky:
>
> - you're working with an immutable type
> - the API implementing the post-creation mutation is a new one
>
> In both of those cases, the new construction feature of the base class
> probably won't work right without updates to the affected subclass to
> support the new capability (whether that's supporting a new parameter
> in __new__ and __init__, or adding their own implementation of the new
> alternate constructor).

OTOH it's not the end of the world -- until B is updated, you can't use
the new construction feature with subclass B; you have to use the old
way of constructing instances of B. Presumably that old way is still
supported, otherwise the change to A has just broken all of B,
regardless of the new construction feature. Which is possible, but it's
a choice that A's author has to make after careful deliberation. Or
maybe the "class construction" machinery in A is so prominent that it
really is part of the interface between A and any of its subclasses,
and then that API had better be documented. Just saying "you can
subclass this" won't be sufficient.

> I'm genuinely unsure that's a solvable problem in the general case -
> it seems to be an inherent consequence of the coupling between
> subclasses and base classes during instance construction, akin to the
> challenges with subclass compatibility of the unpickling APIs when a
> base class adds new state.

Yup. My summary of it is that versioning sucks.

> However, from a pragmatic perspective, the following approach seems to
> work reasonably well:
>
> * assume subclasses don't change the signature of __new__ or __init__

I still find that a distasteful choice, because *in general* there is
no requirement like that, and there are good reasons why subclasses
might have a different __init__/__new__ signature. (For example dict
and defaultdict.)
> * note the assumptions about the default constructor signature in the
> alternate constructor docs, to let implementors of subclasses that
> change the signature know they'll need to explicitly test
> compatibility and perhaps provide their own implementation of the
> alternate constructor

Yup, you end up having to design the API for subclasses carefully and
then document it precisely. This is what people too often forget when
they complain e.g. "why can't I subclass EventLoop more easily" -- we
don't want to have a public API for that, so we discourage it, but
people mistakenly believe that anything that's a class should be
subclassable.

> You *do* still end up with some cases where a subclass needs to be
> upgraded before a new base class feature works properly for that
> particular subclass, but subclasses that *don't* change the
> constructor signature "just work".

The key is that there's an API requirement and that you have to design
and document that API with future evolution in mind. If you don't do
that and let people write subclasses that just happen to work, y
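[A sketch of the pattern discussed in this thread, using hypothetical `Config`/`from_pairs` names: an alternate constructor that builds a default instance and mutates it through public API methods is inherited safely by subclasses that keep the default constructor signature, while a subclass that changes the signature (as defaultdict does relative to dict) breaks it.]

```python
class Config:
    """Hypothetical base class living in 'Component A'."""

    def __init__(self):
        self._values = {}

    def set(self, key, value):
        self._values[key] = value

    @classmethod
    def from_pairs(cls, pairs):
        # New convenience constructor: create a default instance, then
        # mutate it through existing public API methods only. No new
        # requirements are placed on the default constructor.
        self = cls()
        for key, value in pairs:
            self.set(key, value)
        return self


class LoggingConfig(Config):
    """Subclass in 'Component B': adds behaviour, no new state, and keeps
    the default constructor signature -- so from_pairs() 'just works'."""

    def set(self, key, value):
        print(f"setting {key!r}")
        super().set(key, value)


class SchemaConfig(Config):
    """Subclass that changes the __init__ signature: the inherited
    from_pairs() now fails, because cls() needs an argument."""

    def __init__(self, schema):
        super().__init__()
        self.schema = schema


cfg = LoggingConfig.from_pairs([("debug", True)])  # works, returns a LoggingConfig
try:
    SchemaConfig.from_pairs([("debug", True)])
except TypeError:
    print("from_pairs() broke for the signature-changing subclass")
```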
Re: [Python-Dev] Wrong OSX platform in sysconfig.py causing installation problems
Thanks Joseph,

I say this isn't worth pursuing until someone else reports the issue,
as I couldn't replicate it with a "clean" system.

> I had the old 10.6 SDK in /Developer/SDKs, it could have been 'special'
> in that I did a lot of weird things for a former project and I just
> don't recall.

I suspect your 10.6 SDK was configured oddly, or...

> For the sake of explanation though, let's say it wasn't, what then?
>
> I installed OSX Python 3.5.1, created a virtual environment using the
> pyvenv command from that newly installed Python 3.5.1, and sourced its
> bin/activate.
> Then when running pip install ZODB, the compiler would be passed the
> -isysroot flag for the 10.6 SDK, which would change its include paths
> and it would find no includes or libs

This is the odd part -- shouldn't the new sysroot (i.e. the SDK) have
all the includes and libs needed?

> Not sure why that is, or what would make the virtual environment
> different, but there you go.

Yeah, me neither.

Anyway, unless someone else sees this, I think we're done.

-CHB

--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R           (206) 526-6959 voice
7600 Sand Point Way NE  (206) 526-6329 fax
Seattle, WA 98115       (206) 526-6317 main reception

[email protected]
Re: [Python-Dev] Wrong OSX platform in sysconfig.py causing installation problems
> On May 9, 2016, at 11:03 AM, Chris Barker wrote:
>
>> I installed OSX Python 3.5.1, created a virtual environment using the
>> pyvenv command from that newly installed Python 3.5.1, and sourced its
>> bin/activate.
>> Then when running pip install ZODB, the compiler would be passed the
>> -isysroot flag for the 10.6 SDK, which would change its include paths
>> and it would find no includes or libs
>
> this is the odd part -- shouldn't the new sysroot (i.e. the SDK) have
> all the includes and libs needed?

That is the weird part: when installing into the system Python, the new
sysroot worked just fine. It was only when trying to install into a
virtual environment that it failed. What the difference is, I just
don't know.

Thanks again for your efforts!

Joseph
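[A quick diagnostic sketch for reports like this one: print the platform tag and compiler flags that the build machinery will use, then compare the output of the system interpreter against the interpreter inside the venv to spot a stray `-isysroot` pointing at an old SDK. The specific values shown in the comments are illustrative.]

```python
import sysconfig

# Run this both outside and inside the virtual environment and diff:
print(sysconfig.get_platform())            # e.g. 'macosx-10.6-intel'
print(sysconfig.get_config_var('CFLAGS'))  # look for an -isysroot entry here
```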
[Python-Dev] PyGC_Collect ignores state of `enabled`
Is this deliberate? Could we change it for 3.6 so PyGC_Collect does
take `enabled` into account?

Context: we're experimenting with disabling the gc entirely for some
workloads because it doesn't fare too well with copy-on-write on Linux.
What ends up happening is that disabling the gc actually drops memory
usage significantly if you're running hundreds of the same process on
one box :-)

However, because of the PyGC_Collect() call in Py_Finalize(), the
collection is done anyway during interpreter shutdown; Linux does CoW
and the memory usage spikes. Which is ironic on process shutdown.

--
Lukasz Langa | Facebook
Production Engineer | The Ministry of Silly Walks
(+1) 650-681-7811
Re: [Python-Dev] PyGC_Collect ignores state of `enabled`
Adding PyGC_CollectIfEnabled() and calling it in Py_Finalize is
probably fine. I don't think the contract of PyGC_Collect itself (or
gc.collect() for that matter) should be changed. You might want to
disable GC but invoke it yourself.

On Mon, May 9, 2016, at 19:13, Łukasz Langa wrote:
> Is this deliberate? Could we change it for 3.6 so PyGC_Collect does
> take `enabled` into account?
>
> Context: we’re experimenting with disabling the gc entirely for some
> workloads because it doesn’t fare too well with Copy-on-Write on Linux.
> What ends up happening is that disabling the gc actually drops memory
> usage significantly if you’re using hundreds of the same processes on
> one box :-)
>
> However, because of PyGC_Collect() called in Py_Finalize(), during
> interpreter shutdown the collection is done anyway, Linux does CoW and
> the memory usage spikes. Which is ironic on process shutdown.
>
> --
> Lukasz Langa | Facebook
> Production Engineer | The Ministry of Silly Walks
> (+1) 650-681-7811
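[The contract being defended here -- explicit collection still runs while automatic collection is disabled -- is visible from pure Python as well, since gc.collect() wraps the same machinery:]

```python
import gc

gc.disable()               # turn off automatic (threshold-driven) collection
assert not gc.isenabled()

# An explicit collection still runs regardless of the enabled flag --
# exactly the behaviour worth preserving: you may want the gc off but
# still trigger it yourself at convenient points.
unreachable = gc.collect()
print("collected", unreachable, "objects")

gc.enable()
```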
