In a message of Tue, 10 Nov 2015 06:45:40 +1100, Ben Finney writes:
>So the remaining space of code that is safe for the proposed
>optimisation is trivially small. Why bother with such optimisations, if
>the only code that can benefit is *already* small and simple?

You have things backwards.
The reason that you want to optimise this is that it is small, simple,
and slow, slow, slow, slow, slow.

It is the things that are complicated and large that usually aren't
worth optimising: you don't call them often enough for the effort to
pay off, and the analysis and setup overhead will exceed any speed
gains you realise.  If you can find something that is small, simple,
and ubiquitous -- that's the place to look for performance gains.
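
As a rough illustration (my own toy example, not from the thread; the
names are made up): the operation below is tiny, but it runs a million
times per call, so shaving its per-iteration cost is the whole ball
game.

import timeit

def slow_total(values):
    total = 0
    for v in values:        # small, simple, executed constantly
        total += v
    return total

def fast_total(values):
    return sum(values)      # the same tiny operation, pushed into C

values = list(range(10**6))
print(timeit.timeit(lambda: slow_total(values), number=10))
print(timeit.timeit(lambda: fast_total(values), number=10))

The sum() version is typically several times faster in CPython, purely
because the tiny ubiquitous operation is the one whose cost dominates.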

PyPy gets most of its speed by optimising loops.  Most loops in Python
run perfectly well without making use of any of the dynamism that is
potentially available -- "it's nice to have the ability to do something
that, as a matter of fact, I am not going to do right now".  But the
overhead of checking whether I have made use of it is huge, which is
why 'set things up so that if it is never used, we can go pretty much
as fast as C' is a win.  Once you have paid the setup cost of building
both the fast path and the slow path, you get enough fast-path
activity that you come out way ahead.
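
Here is a toy sketch of that guard idea in plain Python (my own
illustration -- nothing like PyPy's actual machinery, just the shape of
it): capture the current state once at setup, then on each call do one
cheap check; if nobody has used the dynamism, take the cached fast path.

class Vec:
    def __init__(self, x):
        self.x = x
    def __add__(self, other):
        return Vec(self.x + other.x)

def make_adder(cls):
    expected = cls.__add__               # guard value, captured at setup
    def add(a, b):
        if type(a).__add__ is expected:  # guard: class still unchanged?
            return expected(a, b)        # fast path: direct call
        return a + b                     # slow path: full dynamic dispatch
    return add

add = make_adder(Vec)
print(add(Vec(1), Vec(2)).x)             # guard holds, fast path: 3
Vec.__add__ = lambda a, b: Vec(0)        # someone finally uses the dynamism
print(add(Vec(1), Vec(2)).x)             # guard fails, slow path: 0

A real JIT compiles the fast path down to machine code and makes the
guard nearly free, which is how 'as fast as C when the dynamism goes
unused' becomes possible.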

Laura



