performance critical Python features

Chris Angelico rosuav at gmail.com
Thu Jun 23 22:08:30 EDT 2011


On Fri, Jun 24, 2011 at 10:07 AM, Steven D'Aprano
<steve+comp.lang.python at pearwood.info> wrote:
> On Fri, 24 Jun 2011 04:00:17 +1000, Chris Angelico wrote:
>
>> On Fri, Jun 24, 2011 at 2:58 AM, Eric Snow <ericsnowcurrently at gmail.com>
>> wrote:
>>> So, which are the other pieces of Python that really need the heavy
>>> optimization and which are those that don't?  Thanks.
>>>
>>>
>> Things that are executed once (imports, class/func definitions) and
>
> You can't assume that either of those things are executed once. Consider
> this toy example:

Sure. I was talking in generalities; of course you can do expensive
operations frequently. If you wanted to, you could do this:

radius = 5
circum = 0
for i in range(10, 1000):
    # hypothetical expensive call, repeated in a tight loop
    c = 2 * radius * calculate_pi_to_n_decimals(i)
    if c > circum:
        circum = c

That calculates the largest possible circumference of a circle of that
radius. Does this mean we now have to optimize the pi-calculation
algorithm so it can be used in a tight loop? Well, apart from the fact
that this code is moronic, no. All you need to do is cache the result.
(Although I guess in a way that's an optimization of the algorithm;
it's the same optimization as is done for imports.)
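
Something like this would do it (just a sketch: calculate_pi_to_n_decimals
is the made-up expensive function from the snippet above, filled in here
with the pi series recipe from the decimal module docs and wrapped in
functools.lru_cache):

from decimal import Decimal, getcontext
from functools import lru_cache

@lru_cache(maxsize=None)
def calculate_pi_to_n_decimals(n_decimals):
    # The expensive part: sum the series until it stops changing at this
    # precision. (prec counts significant digits, close enough for a sketch.)
    getcontext().prec = n_decimals + 2
    lasts, t, s, n, na, d, da = 0, Decimal(3), 3, 1, 0, 0, 24
    while s != lasts:
        lasts = s
        n, na = n + na, na + 8
        d, da = d + da, da + 32
        t = (t * n) / d
        s += t
    getcontext().prec = n_decimals
    return +s            # unary plus rounds back to the target precision

calculate_pi_to_n_decimals(500)   # pays the full cost
calculate_pi_to_n_decimals(500)   # near-free: answered from the cache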

But generally speaking, functions are called more often than they're
defined, especially when we're talking about tight loops. And while
your example could be written without the repeated definition:

def outer(a, b):
    x = b**2 - a**2
    return (x*a - b)*(x*b - a) - 1

results = [outer(a, b) for (a, b) in coordinate_pairs()]

(at least, I think this is the same functionality), it would be a
different story if inner() were recursive. But recursive inner functions
aren't nearly as common as write-once, call-many functions.
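
For what it's worth, here's a quick way to see what the repeated
definition costs (with_nested_def and with_hoisted_def are names I've
made up; exact timings will vary):

from timeit import timeit

def with_nested_def(a, b):
    # builds a brand-new inner() function object on every call
    def inner(x):
        return (x*a - b)*(x*b - a) - 1
    return inner(b**2 - a**2)

def _inner(x, a, b):
    return (x*a - b)*(x*b - a) - 1

def with_hoisted_def(a, b):
    # reuses a function defined exactly once
    return _inner(b**2 - a**2, a, b)

print(timeit("f(3, 4)", setup="from __main__ import with_nested_def as f"))
print(timeit("f(3, 4)", setup="from __main__ import with_hoisted_def as f"))

The nested version isn't wrong, it just pays to build the function object
every time through, which is the define-once, call-many point above.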

ChrisA


