> About a month ago Steve Langasek and I discussed the state of Python
> packages on IRC, in particular the effects of bytecode compilation; the
> effectiveness (or lack thereof) of it, and how it tightens Python
> dependencies. I'd like to propose three changes to how Python modules
> are handled.
I have a couple of questions.

Do you intend this proposal to apply to Python libraries, or Python applications, or both? I'm thinking that many applications (especially ones like EoC) would be built for just a single version of Python anyway. In that case, why would it matter whether we have pre-compiled bytecode around?

What would you suggest doing about "hybrid" packages which are primarily applications, but also want to make their modules available to other Python programs? Two examples here are pychecker and epydoc (both maintained by me). Right now, for those packages, I stick the .py, .pyc and .pyo files in site-python, compile the modules for the default version of Python (a rough sketch of that compile step is below), and then live with the inefficiency of recompiling them for non-default Python versions and/or the possibility that root will recompile them for the wrong version. This isn't a great solution, but it works. Some folks don't seem to like it very much, though, and I've definitely gotten some pushback about the structure of these packages.

Finally, what do you suggest doing with packages that contain both pure-Python modules and C extensions? It seems that any package which contains a C extension is necessarily tied to a specific version of Python, and so might as well ship pre-compiled modules. You didn't seem to address this in your proposal, but maybe that's because you're assuming the current policy is appropriate.
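For reference, here is a minimal sketch of the compile step I mean; the path and package name are illustrative, and the real packages drive this from their maintainer scripts rather than a standalone script:

    # Sketch only -- not the actual pychecker/epydoc maintainer scripts.
    # Byte-compile the pure-Python modules installed under site-python for
    # whichever interpreter runs this; the .pyc files land next to the .py
    # files, and running it again under "python -O" produces the .pyo files.
    import compileall

    # Illustrative path; substitute the package's own directory.
    compileall.compile_dir('/usr/lib/site-python/epydoc', force=True)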
Thanks,

KEN

-- 
Kenneth J. Pronovici <[EMAIL PROTECTED]>