Bytecode optimisation

Christian Tismer tismer at appliedbiometrics.com
Wed May 19 17:14:07 EDT 1999


Graham Matthews wrote:
> 
> Christian Tismer (tismer at appliedbiometrics.com) wrote:
> : There is just a small socket of speed penalty for using Python.
> :
> : The bad thing is that when your objects become tiny and simple,
> : the necessary operations become very small, and what remains
> : is now the small Python socket, which is tuned into a mountain
> : by that.
> :
> : Therefore, I see no need to make Python faster in all cases,
> : but just in the small simple ones, where my collegues say
> : "hey, nice done in Python, but why is it 50 times faster in C?"
> :
> : Instead, I want Python to become as dumb as C for small
> : problems, whereby keeping its wonderful flexibility for
> : the rest.
> 
> My point was that I very very much doubt that you can do what you
> want, namely retain Python's flexibility AND have C like performance
> for the small things. Flexible semantics almost always imply poor
> performance. There are some exceptions to this rule, notably Self,
> but they are very rare, and tend to involve either excruciating
> design care from the start (eg. Self was really designed with the
> optimisations it does in mind), or a good deal of opportunism and
> luck. Since Python was not designed to be fast (rather it was
> designed to be flexible), you are going to have to be very very
> lucky to get your C performance.

Agreed, yes, I know. Still, I can't give up thinking about this.
Of course it is a bad idea to want to do everything with
Python. That gets me down for a couple of days, and then I go
back to using it as a glue language and it's fine.

But the next time, I have to write a program which does something
more complex in the large, yet needs a tiny, fast little
function in its inner loop. Say:

def tinylittlefunction(a, b, type=type):
    assert type(a) == type([])
    for elem in a: assert type(elem) == type("")
    assert type(b) == type("") and len(b) == 1
    # now performing a couple of iterations and
    # simple computations, all in plain Python without
    # any function calls.

I just added the assert stuff to clarify my intent. All the
type information is provided, and flexibility is limited by design.
Let's assume that this is where the majority of the computation sits.
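
For concreteness, a hypothetical filled-in version of such a function
might look like this (the task, counting how often the single character
b occurs across the strings in a, is purely made up for illustration):

def tinylittlefunction(a, b, type=type):
    # hypothetical body: count how often the single character b
    # occurs across the list of strings a
    assert type(a) == type([])
    for elem in a: assert type(elem) == type("")
    assert type(b) == type("") and len(b) == 1
    n = 0
    for elem in a:
        for ch in elem:
            if ch == b:
                n = n + 1
    return n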

What should I do? Generate C source code, compile and link it?
Maybe, after writing a type and constraint inference
machine, which is no cakewalk. Well, it could even crank up
the C compiler automatically, although I still think
this could be done better.
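
For what it's worth, here is a minimal sketch of that route, assuming
a Unix-like system with a C compiler on the PATH; the helper, the file
names and the toy C function are all made up for illustration:

import ctypes, os, subprocess, tempfile

C_SOURCE = """
/* the generated primitive: count occurrences of character c in s */
long count_char(const char *s, char c) {
    long n = 0;
    while (*s) { if (*s == c) n++; s++; }
    return n;
}
"""

def build_primitive(source, symbol):
    # write the generated C to a scratch directory, compile it into a
    # shared library, and load the requested symbol via ctypes
    tmpdir = tempfile.mkdtemp()
    c_path = os.path.join(tmpdir, "gen.c")
    so_path = os.path.join(tmpdir, "gen.so")
    with open(c_path, "w") as f:
        f.write(source)
    subprocess.check_call(["cc", "-O2", "-shared", "-fPIC",
                           "-o", so_path, c_path])
    fn = getattr(ctypes.CDLL(so_path), symbol)
    fn.restype = ctypes.c_long
    fn.argtypes = [ctypes.c_char_p, ctypes.c_char]
    return fn

count_char = build_primitive(C_SOURCE, "count_char")
print(count_char(b"banana", b"a"))    # prints 3

The real work, of course, is emitting that C source from the Python
body and the asserted types, which is exactly the inference machine
mentioned above.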

I don't claim that optimization should work fully automatically,
but why not with a collaborative user?

Many problems can be split into simple, fast parts and other,
complex parts which need not change. I just want the new
primitive function generated on the fly.
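
Purely as an illustration of that split (all the names here are made
up), the flexible outer part could stay in Python and accept such an
on-the-fly primitive from a collaborative user:

def occurrences(strings, ch, primitive=None):
    # complex part: stays plain Python, fully flexible
    count = primitive or (lambda s, c: s.count(c))    # generic fallback
    total = 0
    for s in strings:
        total += count(s, ch)
    return total

# collaborative user: "these really are byte strings and a single byte,
# so use the compiled primitive from the sketch above"
# total = occurrences([b"banana", b"abc"], b"a", primitive=count_char)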

I'm banging my head on the keyboard, since I keep thinking: "It could
do it; I can see it. I told it everything. Be quick and dumb."

So why is it so hard to find the Right Thing(TM)?

ciao - chris

-- 
Christian Tismer             :^)   <mailto:tismer at appliedbiometrics.com>
Applied Biometrics GmbH      :     Have a break! Take a ride on Python's
Kaiserin-Augusta-Allee 101   :    *Starship* http://starship.python.net
10553 Berlin                 :     PGP key -> http://wwwkeys.pgp.net
PGP Fingerprint       E182 71C7 1A9D 66E9 9D15  D3CC D4D7 93E2 1FAE F6DF
     we're tired of banana software - shipped green, ripens at home



