Question about math.pi is mutable

Steven D'Aprano steve at pearwood.info
Tue Nov 10 20:10:56 EST 2015


On Tue, 10 Nov 2015 11:14 pm, Ben Finney wrote:

>> Python -- yes, even CPython -- has a runtime compiler. When you import
>> a module, it is compiled (if needed) just before the import. Likewise,
>> when you call the `compile`, `eval` or `exec` built-ins, the compiler
>> operates.
>>
>> I'm not calling this a JIT compiler, because the simple-minded
>> compilation performed by `compile` etc doesn't use any run-time
>> information. It just statically compiles the code to byte-code.
> 
> That's what I thought. I'm aware of JIT compilers, and was pretty sure
> Python doesn't have them. 

What do you think PyPy is then?

CPython is intentionally a pretty simple-minded compiler. Even no-brainer
constant folding is mildly controversial among some Python devs. But
CPython isn't "Python", it is just the reference implementation. Nuitka
aims to be an optimized AOT (Ahead Of Time) Python compiler, and PyPy is
already a state of the art (if not cutting edge) JIT Python compiler.
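
If you want to see the folding CPython already does, `dis` will show it
(the exact byte-code varies between versions, but the multiplications are
gone either way):

    import dis

    # The peephole optimiser folds 60 * 60 * 24 at compile time, so the
    # byte-code simply loads the pre-computed constant 86400.
    dis.dis(compile("seconds_per_day = 60 * 60 * 24", "<example>", "exec"))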

And if Victor Stinner's experimental FAT Python works out, even CPython
itself will gain some simple JIT techniques, rather similar to what Psyco
was doing ten years ago.
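
Roughly, the trick is guard-based specialisation: generate a fast version
of some code under an assumption, and protect it with a cheap check that
the assumption still holds. A toy sketch of the idea only -- the names are
made up and FAT Python's actual machinery is quite different:

    _original_len = len   # captured when the specialised code was created

    def length_of_abc():
        # Guard: the folded body is only valid while len() still refers
        # to the same object it did at specialisation time.
        if len is _original_len:
            return 3              # specialised: len("abc") folded away
        return len("abc")         # generic fallback, same semantics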


> What's more, if it operates at a whole-module 
> level, and not later than when the module is imported.

Now you're getting into implementation details of the JIT compiler. Can it
track entities across modules? How exactly does it operate?

PyPy is a "tracing JIT", which (if I have understood correctly) means it
actually analyses the code as it runs, using knowledge gained at runtime to
dynamically decide what to compile. This means that PyPy is best suited to
code that has long-running loops, and not well suited to scripts that don't
run for a long time. But another (hypothetical) JIT compiler might be more
like Psyco.
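
To make that concrete, a toy contrast (illustrative only, nothing to do
with how PyPy actually makes its decisions):

    def checksum(data):
        # A tight, long-running loop is what a tracing JIT is good at:
        # after enough iterations it has observed that total and b are
        # plain ints and can compile a specialised version of the loop.
        total = 0
        for b in data:
            total = (total + b) % 65521
        return total

    # Called once on a handful of bytes, the loop never gets "hot", so a
    # tracing JIT pays its bookkeeping cost without ever recouping it.
    print(checksum(b"hello world"))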


> Under those conditions, I maintain my objection to the proposed
> optimisation.
> 
> The proposal is explicitly for optimisations made by the Python
> compiler. As proposed, it only seems to be worthwhile once the
> distinction between Python's “compiler” and “interpreter” is dissolved.
> Until then, it seems pointless.

Python has not had a distinct "compiler" and "interpreter" since about
version 0.1. The execution model of Python as a dynamic byte-code
compiled, interpreted language with eval and exec *depends* on the compiler
being available during the execution phase, i.e. at runtime.
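
You can watch the compiler working at runtime with nothing but the
built-ins (the disassembly differs slightly between versions):

    import dis

    source = "print(greeting * 2)"
    code = compile(source, "<generated>", "exec")   # compiled right now

    dis.dis(code)                        # it really is ordinary byte-code
    exec(code, {"greeting": "hello "})   # prints: hello hello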

Some of the things I've suggested already exist, and have proven that they
work: Psyco and PyPy, to say nothing of similar technologies for other
equally dynamic languages such as JavaScript and Smalltalk. Others are
state of the art compiler techniques -- they might not have been applied to
*Python* but that's only because nobody has got around to it, not because
it can't be done.

I daresay that there are implementation challenges to making Python faster
than it is without compromising on the semantics of the language, but there
is nothing fundamentally impossible about the idea.

Ben, I know that the IT crowd is rather conservative, and the Python
community even more so, but at the point where you are denying the
possibility of technologies which *already exist* I think that things have
got a bit out of hand. It's a bit like somebody getting on Twitter to
tweet "A globally interconnected communication network allowing computers
all over the world to communicate? Impossible! And even if it were
possible, nobody would use it."

;-)

 

-- 
Steven



