Bytecode optimisation

Christian Tismer tismer at appliedbiometrics.com
Sun May 23 09:30:36 EDT 1999


William Tanksley wrote:
...
> >> Or as I mentioned in my reply to Mark-Andre Lemburg, there may be a case
> >> for an optimisation level which assumes sanity for many standard things,
> >> so that
> 
> >> for x in range(1000):
> >>   for y in range(1000):
> >>     do_stuff()
> 
> >> could assume that do_stuff() doesn't change the meaning of range and so
> >> this could be optimised to:
> 
> >> r = range(1000)
> >> for x in r:
> >>   for y in r:
> >>     do_stuff()

[C. Tismer:]
> >Looks promising in the first place. But you quickly realize
> >that this is no real optimization since you will do it
> >by hand, anyway, or use xrange.
> 
> But _why_?  If I become really interested in speed, I don't want to slave
> away making my beautiful Python code look like C.  I would rather write
> in C.
> 
> And I would rather have _less_ incentive to write in C, and zero incentive
> to write baroque Python (for speed).  An automated optimizer certainly
> could handle this.

I agree. But obvious optimizations are not my goal. A lot of
things can of course be done at the opcode level.
My point is that this whole area is bounded by an upper
limit which I know.
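For concreteness, the hoisting transformation quoted above can be sketched as follows. This is only a toy illustration: in the Python 2 of this era, xrange() avoided building the list at all, and modern range() is lazy anyway, so what the rewrite saves is the repeated calls, not much more.

```python
def nested_plain(n, do_stuff):
    # Naive form: range(n) is re-evaluated on every outer iteration.
    for x in range(n):
        for y in range(n):
            do_stuff(x, y)

def nested_hoisted(n, do_stuff):
    # Hand-optimized form: the sequence is built once and reused --
    # exactly the rewrite an automated optimizer could do, *if* it
    # could assume do_stuff() never rebinds the name "range".
    r = list(range(n))
    for x in r:
        for y in r:
            do_stuff(x, y)
```

Both versions visit the same (x, y) pairs; the rewrite is only safe under the "sanity" assumption discussed above.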

> >The loop opcode will anyway continue to call a sequence_getitem,
> >whether it knows the type or not.
> 
> Ah, another place for optimization, eh?

The real key. This is where the real game starts.
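As a rough way to see where that per-iteration dispatch lives, the dis module shows the loop opcodes. (A hedged note: in CPython 1.5 the FOR_LOOP opcode performed the generic sequence getitem each time around; modern CPython uses FOR_ITER instead, so the opcode names below reflect today's bytecode, not 1.5's.)

```python
import dis

def loop():
    # A simple counted loop like the ones under discussion.
    total = 0
    for i in range(10):
        total += i
    return total

# List the opcodes the interpreter dispatches on for this function.
# The generic iteration opcode (FOR_ITER in modern CPython) is where
# a type-specialized fast path would have to plug in.
ops = [ins.opname for ins in dis.get_instructions(loop)]
```

Printing `ops` shows FOR_ITER among the instructions; specializing that generic fetch for known sequence types is the game being described here.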

> >> >if-you-get-better-than-twenty-percent-speed-up-I'll-be-impressed-ly y'rs
> >> >Michael
> 
> >Guys, you are at the wrong end. This is another complete waste
> >of time. The whole interpreter overhead is bound to an average
> >of about 30 percent, when everything is C-code. I'd wonder if
> >bytecode optimization would give more than 15 percent on
> >average, when semantics are not changed.
> 
> 15% is a KILLER speedup for optimization without algorithm changes.
> Personally, I would consider that a vindication -- but anything less is
> not obviously a loss.

I disagree. From Python 1.4 to 1.5, there was a 50 percent
speed increase. This was worth it.

15% is a nice result. But I doubt it would cause
many users to upgrade their installation.
Even after I had tweaked eval_code to run 11% faster
under Windows, do you think I use it?
It isn't worth becoming incompatible with the
mainstream.

> This is one reason why I brought up token-threading.  Think of it as
> 16-bit "byte"codes.  16K instruction space...  or maybe even 4G.  Enough
> to assign a code to every user function in a program (for use only when
> we're certain, of course).

I'm not going to build static, huge monsters.
Optimization doesn't mean speed alone. There must
be a weighted harmony between speed and size.
You can't achieve this statically.
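A toy sketch of the token-threading idea under discussion (this is not CPython's dispatch mechanism, and the token numbers and handler names are made up): each token is nothing but an index into a handler table, so a 16-bit token space could in principle give every specialized operation its own slot.

```python
def make_vm(handlers):
    """Build a trivial stack machine that dispatches via a token table."""
    def run(tokens, stack):
        for tok in tokens:
            handlers[tok](stack)  # one table lookup per token, no decoding
        return stack
    return run

# Hypothetical handler table; a token-threaded design could hold
# thousands of entries, one per specialized operation or user function.
handlers = {
    0: lambda s: s.append(1),                  # PUSH_ONE
    1: lambda s: s.append(s.pop() + s.pop()),  # ADD
    2: lambda s: s.append(s.pop() * 2),        # DOUBLE
}

run = make_vm(handlers)
result = run([0, 0, 1, 2], [])  # push 1, push 1, add, double
```

The size concern raised above is real: every specialized slot is another handler kept resident, which is where the speed/size trade-off bites.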

> And Anton's finally published his stack-code optimizer at
> http://www.complang.tuwien.ac.at/projects/rafts.html.  It's not
> complete, and he hasn't published the source yet (but it's based on a
> GPLed product), but already it compiles Forth to as much as 80% of C speed
> (this is a special case, though).

Interesting. What is the special case? Forth, or the program
which he can make so fast?

> >no-longer-optimizing-for-less-than-50-percent-ly y'rs - chris
> 
> That's not a wrong attitude.  You'll accept no lesser result than
> innovation.  However, incremental gains are also valuable, and you
> shouldn't reject or even discourage those who attempt to make them.

This was not my intent. I just wanted to point out that I see
limited chances of gains there, and I'm more looking into
what has to be done to get special cases *very* fast.

I'm happy to leave the 15 percent area to others, since
it will happen anyway, with or without me. 

Just making sure that you won't be
disappointed: you cannot expect much more than
what I estimated.

good luck - chris

-- 
Christian Tismer             :^)   <mailto:tismer at appliedbiometrics.com>
Applied Biometrics GmbH      :     Have a break! Take a ride on Python's
Kaiserin-Augusta-Allee 101   :    *Starship* http://starship.python.net
10553 Berlin                 :     PGP key -> http://wwwkeys.pgp.net
PGP Fingerprint       E182 71C7 1A9D 66E9 9D15  D3CC D4D7 93E2 1FAE F6DF
     we're tired of banana software - shipped green, ripens at home



