Felix, 09.07.2010 05:39:
> On Jul 4, 11:25 am, David Cournapeau wrote:
>> Well, I wish I did not have to use C, then :) For example, as a
>> contributor to numpy, it bothers me at a fundamental level that so
>> much of numpy is in C.
>
> This is something that I have been thinking about recently. Python has
> won quite a following in the scientific computing area, probably mostly
> because of great libraries such as numpy, scipy, pytables etc. But it
> also seems that Python itself is falling further and further behind in
> terms of performance and parallel processing abilities.
Well, at least its "parallel processing abilities" are actually quite
good. Really large computations usually run on more than one computer
(not just more than one processor), so you can't really get around
using something like MPI. In that case, an additional threading layer
is basically worthless, regardless of the language you use. For
computations, threading continues to be highly overrated.
WRT a single machine, you should note that GPGPUs are a lot faster these
days than even multi-core CPUs. And Python has pretty good support for
GPUs, too.
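For example, with PyCUDA (just a sketch, assuming PyCUDA and a
CUDA-capable card are available; PyOpenCL looks almost identical):

import numpy as np
import pycuda.autoinit              # set up the GPU context
import pycuda.gpuarray as gpuarray

# move a NumPy array to the GPU, compute there, copy the result back
a = np.random.randn(4, 4).astype(np.float32)
a_gpu = gpuarray.to_gpu(a)
doubled = (2 * a_gpu).get()         # elementwise work runs on the GPU

print(np.allclose(doubled, 2 * a))

The NumPy-like gpuarray interface covers a lot of ground before you
ever have to write a CUDA kernel by hand.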
> Of course all that can be fixed by writing C modules (e.g. with the
> help of Cython), but that weakens the case for using Python in the
> first place.
Not at all. Look at Sage, for example. It's attractive because it
provides tons of functionality, all nicely glued together through a
simple language that even non-programmers can use efficiently and
effectively. And its use of Cython makes all of this easily extensible
without having to cross a language border.
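To illustrate what "no language border" means: you speed up a tight
loop by adding a few static type declarations to otherwise plain Python
code. A toy .pyx sketch (not taken from Sage; names are made up):

# integrate.pyx - compile with a distutils setup.py, or simply
# "import pyximport; pyximport.install()" and then import it.

def integrate(double a, double b, int n):
    """Midpoint-rule integration of x*x over [a, b]."""
    cdef int i
    cdef double x
    cdef double dx = (b - a) / n
    cdef double total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * dx
        total += x * x * dx
    return total

From Python (or the Sage prompt) this is an ordinary import and an
ordinary function call: no wrappers, no header files, and the typed
loop runs at C speed.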
Stefan