Dhananjay wrote:
On Aug 7, 11:58 pm, Terry Reedy <[EMAIL PROTECTED]> wrote:
[EMAIL PROTECTED] wrote:
> Are there any implications of using psyco?

It compiles statements to machine code for each set of types used in the
statement or code block over the history of the run.  So code used
polymorphically with several combinations of types can end up with
several compiled versions (same as with C++ templates).  (But a few
extra megabytes in the running image is less of an issue than it was
even 5 or so years ago.)  And time spent compiling for a combination
used just once gains little.  So it works best with numeric code used
just for ints or floats.

Terry J. Reedy
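
A minimal sketch of the usage pattern described above, for illustration
only (it assumes psyco is installed; psyco runs only on 32-bit CPython
2.x, and psyco.bind is part of its documented API):

    import psyco

    def isum(n):
        # A tight integer loop: a single type combination, psyco's best case.
        total = 0
        for i in range(n):
            total += i
        return total

    psyco.bind(isum)      # emit machine code specialized for the types seen
    print(isum(10 ** 6))

psyco.full() would instead compile everything it can, trading memory for
coverage, per the template-like duplication of compiled versions noted
above.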

> But if site caching is indeed being adopted by so many dynamic language
> runtime environments, I kind of wonder what makes Python hold back from
> bringing it into its core. Is it a question of time and effort,

Yes, and of the priorities of the *current* volunteer developers, and of the impact of added complexity and maintainability burden on being dependably correct.

> or is there something that doesn't make it appropriate to Python?

How about: less necessary. Python was designed to be extended by native code. The built-in functions and classes are built-in extensions. The compiled modules in the stdlib are importable extensions. *So are native-code 3rd-party extensions*!
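
For illustration, a minimal sketch of the "extended by native code" point
using only the stdlib's ctypes (library-name lookup is platform-dependent,
so treat the "m" lookup as an assumption that holds on typical Unix
systems):

    import ctypes
    import ctypes.util

    # Locate and load the platform's C math library; find_library may
    # return None on platforms where the name cannot be resolved.
    libm = ctypes.CDLL(ctypes.util.find_library("m"))
    libm.sqrt.restype = ctypes.c_double
    libm.sqrt.argtypes = [ctypes.c_double]

    print(libm.sqrt(2.0))   # the actual work happens in compiled C code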

Numeric, Python's first killer app, and now NumPy, which both wrap standard Fortran and C libraries, have been standard associated parts of 'Python' (CPython) as used in the science, engineering, and technical community for over a decade. There are features in Python that were introduced for their use. Python 3.0 has (if plans get fulfilled) a new multi-dimensional buffer class/interface/C-API (not sure of the details ;-) introduced by Travis Oliphant of NumPy for use by extension writers, so that extensions, built-in or 3rd-party, can work together (share C-level data) without copying. For instance, the C-level data for an image imported into a cooperating image program could be directly manipulated by NumPy and then directly blitted to the screen by a display manager.
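
The no-copy sharing idea can be sketched with NumPy via the buffer
protocol; the 16-byte "image" below is just an illustrative stand-in for
memory owned by some other extension:

    import numpy as np

    pixels = bytearray(16)                        # stand-in for image bytes owned elsewhere
    view = np.frombuffer(pixels, dtype=np.uint8)  # an array *view*, not a copy
    view[:4] = 255                                # NumPy writes in place...
    print(repr(pixels[:4]))                       # ...and the original buffer sees it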


About benchmarks: machine-speed benchmarks miss the point that the real limiting factor in information processing is programmer availability and the speed of writing and debugging. Python was optimized for *this*, not machine speed per se, knowing that most machine bottlenecks could be rewritten in C or another low-level language. Why reinvent that wheel? Measuring programmer productivity is hard, but it is real.
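
As a concrete instance of "rewrite the bottleneck in C": the two
statements below do the same work, but sum() runs its loop in C. Timings
vary by machine; only the shape of the comparison matters:

    import timeit

    setup = "data = range(1000)"
    py_loop = "total = 0\nfor x in data: total += x"
    c_loop = "sum(data)"

    print(timeit.timeit(py_loop, setup, number=10000))  # interpreted loop
    print(timeit.timeit(c_loop, setup, number=10000))   # same loop, run in C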

Number-crunching benchmarks that disallow NumPy because it is not distributed with the 'core' by the PSF (though it *is* by other packagers) are simply braindead. If a scientist/engineer/technologist saves half an hour setting up a 10-hour run by using Python instead of C, it is *completely irrelevant* that the C interface to the number-crunching libraries might run in 1/10 of a second instead of, say, 10 seconds. That is why such people developed Numeric and then NumPy.
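
In concrete terms (the array size is arbitrary, chosen only for
illustration): one Python statement below dispatches a million-element
computation to NumPy's compiled loops, so the Python-side dispatch
overhead is a fixed, tiny cost on top of the real work:

    import numpy as np

    a = np.random.rand(1000000)   # a million doubles
    # One Python statement; the element-wise loop runs in compiled code.
    print(np.sqrt(a).sum())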


None of this is to say that continued development of Psyco or equivalent for new versions would not be great. But that comes down to *someone* volunteering the resources.

Terry Jan Reedy


