Question about math.pi is mutable

Steven D'Aprano steve at pearwood.info
Sat Nov 7 22:50:04 EST 2015


On Sun, 8 Nov 2015 01:23 am, Marko Rauhamaa wrote:

> Bartc <bc at freeuk.com>:
> 
>> Not just my option. From this 2010 paper for example ('High performance
>> implementation of Python for CLI ...' by Antonio Cuni):
>>
>> "As a language, Python is very hard to implement efficiently: the
>> presence of highly dynamic constructs makes static analysis of
>> programs extremely difficult, thus preventing ahead of time (AOT)
>> compilers to generate efficient target code."
> 
> Correct. That's not Python's fault, however. Python should not try to
> placate the performance computing people. (Alas, it is now trying to do
> just that with the introduction of static typing annotation.)

That is factually incorrect.

The motive behind the introduction of typing annotations is not "speed",
but "correctness". Guido has expressed skepticism that typing annotations
can be used to improve the speed of Python programs, and has suggested that
the future of optimization is JIT compilers like PyPy rather than static
compilers like Nuitka.

[Aside: I have to disagree with Guido's thoughts on compilers. Even if JIT
compilers are ultimately faster for long-running code, they don't help much
for short-running code, and surely there is plenty of low-hanging fruit that
an ahead-of-time optimizing compiler could pick. Victor Stinner is
experimenting with a version of CPython which, among other things, detects
when built-ins have not been modified, and can inline them for speed.]
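
To give a flavour of the idea (this is just my sketch of the guard
technique, with made-up names, not Victor's actual implementation), the
check might look something like this:

import builtins

_expected_len = builtins.len  # snapshot taken when the code is specialised

def count(seq):
    # Guard: has the len built-in been rebound since we specialised?
    if builtins.len is _expected_len:
        # Fast path: safe to use the specialised behaviour directly.
        return seq.__len__()
    # Slow path: fall back to the normal, fully dynamic lookup.
    return len(seq)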

If type annotations lead to faster code, that's a bonus, but it isn't why
they are being added to the language. They are being added to improve
correctness and documentation, and to enable type checks to be performed by
linters and static type-checkers.
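
For instance (a trivial sketch; the exact message depends on which checker
you run):

def greet(name: str) -> str:
    return "Hello, " + name

greet(42)  # a checker such as mypy flags the incompatible argument type;
           # CPython itself ignores the annotations at runtime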


>> Plenty of other people working on faster Pythons appear to have
>> trouble with its being so dynamic.
> 
> That's, literally, their problem.
> 
>> As one example of many, named constants (the 'const' feature earlier
>> in the thread), were used for 'switch' statements. Without 'const',
>> then I couldn't have a fast 'switch'.)
> 
> Fast, fast, fast. We have plenty of fast programming languages.

Yeah, but most of them suck. Why can't we have a fast programming language
that is as nice to use as Python? Why can't Python be fast?



> Python 
> should *not* sacrifice its dynamism (which the fast languages don't
> have) for speed. Python is already fast enough for most programming
> tasks.

Well, I don't know about anyone else, but I personally always want my
scripts and programs to run slower. I can't tell you how annoyed I was when
new-style classes became as fast or faster than old-style classes.

*wink*


[...]
>> I suppose my coding style is just different from people who write
>> Python. When I define a function f(), then it's always going to be
>> function f and is never going to change. A call to such a function can
>> therefore be streamlined.
> 
> Your point of view is really down-to-earth. It's slightly analogous to
> protesting against Unicode because you only ever need ASCII.

I don't think so, but in any case, Bart is *way* oversimplifying the
potential optimizations available. Function inlining depends on the
function being small enough that inlining it does more good than harm, but
it doesn't require that f never changes.

Perhaps Laura can confirm this, but I understand that PyPy can inline
functions. Simplified:

The compiler builds a fast path and a slow path, and a guard. If the
function is the one you expect, the interpreter takes the fast path, which
includes the inlined function. But if the function has been rebound, the
guard triggers, and the interpreter takes the slow path, which follows the
slow, dynamic implementation. In pseudo-code:

def spam():
    result = eggs(x)

will compile as if it were written like this:

def spam():
    if eggs is EXPECTED_EGGS:   # guard: eggs has not been rebound
        # fast path: the body of eggs, inlined
        do_this()
        do_that()
        result = another_thing(x)
    else:
        # slow path: the ordinary dynamic call
        result = eggs(x)


The compiler manages all the book-keeping for you: as functions grow and
shrink in complexity, it automatically decides which ones can be inlined and
which cannot. You write the simplest, most obvious code, which is easy to
maintain, and the compiler handles the ugly optimizations.


> You would be right in that Python programs hardly ever make use of or
> directly depend on the degrees of freedom Python affords.

That's certainly true. Every Python program pays a significant speed penalty
(perhaps running as much as 10x slower than it needs to) for features which
are probably used by less than 1% of code.

The choice is not between "dynamic language" and "fast language". You can
have both. Smalltalk proves it. Javascript proves it. You just need smarter
compilers which can optimize the usual, non-dynamic cases while still
allowing dynamic code to work.

We often say that the reason Python is anything up to 100 times slower than
languages like Java is that Python is so dynamic, but that's not true.
Python is so slow because *Python compilers aren't smart enough*. CPython is
a relatively simple reference implementation which applies hardly any
optimizations.

Even such simple things as constant folding are slightly controversial! If
you go back to older versions of Python, code like this:

    x = 1 + 1

actually performed the addition at runtime, instead of being compiled to:

    x = 2

Believe it or not, even something as simple as that remains controversial,
with some folks (including Guido) expressing doubt that constant-folding is
worthwhile.
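
You can check this for yourself with the dis module; in recent CPython
versions the peephole optimizer folds the constant, so the disassembly shows
a single LOAD_CONST of 2 rather than an addition:

import dis

# The compiled bytecode stores the folded constant 2 directly;
# no addition happens at runtime.
dis.dis(compile("x = 1 + 1", "<example>", "exec"))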

But PyPy proves that Python compilers can be smarter, and optimize more,
without sacrificing dynamism.


> They are used 
> every here and there, however, and, most importantly, they make it
> possible to define the semantics of the programming language in a sound,
> concise manner. That way it is easier to implement Python, learn
> Python and use Python correctly.
> 
> Imagine there was a national standard that defined that you could only
> sell hatchets that split wood in a downward movement. You might argue
> that hardly anyone swung a hatchet sideways, and it would be dangerous
> anyway. However, that kind of a standard would make it very difficult to
> manufacture the simple concept of a hatchet. The contraption would be
> expensive, fragile, heavy and difficult to use even for the intended
> up-down splitting purpose.

I don't think that's a good analogy.

A better analogy is, imagine that Ford built a car that allowed you to swap
out the radio, the doors, the tyres, even the engine, while the car was
moving. You could unplug the engine while driving down the road, and plug
in a new one, and the car would just keep going. As a consequence, the car
is significantly bigger and slower than most other cars, and occasionally
it is useful (when you get a flat tyre, you don't have to stop, you just
swap a new tyre into place and keep going) but generally it's not used much
(who swaps out the seat that they are sitting on?).

Ford call this car "Python".
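
In Python terms, the analogy is roughly this kind of thing (a contrived
sketch):

class Car:
    def horn(self):
        return "beep"

car = Car()
print(car.horn())            # beep

# "Swap the engine while driving": rebind the method on the class, and
# every existing instance immediately picks up the new behaviour.
Car.horn = lambda self: "HOOOONK"
print(car.horn())            # HOOOONK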


> I wouldn't like Python to turn into such a contraption.




-- 
Steven



