Fortran

Steven D'Aprano steve+comp.lang.python at pearwood.info
Thu May 29 13:57:26 EDT 2014


On Thu, 29 May 2014 17:50:00 +0300, Marko Rauhamaa wrote:

> albert at spenarnc.xs4all.nl (Albert van der Horst):
> 
>> I always thought that the real point of JIT was that it can take
>> advantage of type information that is not available until runtime. If
>> it can infer that something is an integer, just before entering a loop
>> to be executed millions of times, that should be a big win, not?
> 
> JIT most generally refers to on-the-fly optimization of the code.

Using information available at run-time, which was Albert's point. Why 
would you delay compilation to run-time if you're only going to use 
information that was already available at compile-time? You pay all the 
cost of delayed compilation and get none of the benefits.
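
To make Albert's point concrete, here is a sketch (mine, not his) of 
the sort of hot loop a tracing JIT such as PyPy's wins big on. The 
element type is only knowable at run-time, but in practice it is 
stable:

    def total(items):
        # At run-time the JIT observes that every element is an int,
        # compiles a machine-code loop specialised for unboxed
        # integers, and guards it with a cheap type check in case a
        # float or a string ever turns up.
        t = 0
        for x in items:
            t += x
        return t

    total(range(10**7))  # hot enough for the JIT to kick in

No static analysis of the source could prove the element type; the JIT 
simply watches what actually flows through the loop.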


> In the
> case of Java, the code to be executed is bytecode that is independent of
> the CPU instruction set and thus needs to be interpreted.

Not so. There are Java compilers which compile to machine code.

http://www.excelsiorjet.com/
http://gcc.gnu.org/java/

Both are AOT (Ahead Of Time) compilers, but that's not important for this 
discussion. If an AOT compiler can emit machine code, so can a JIT 
compiler, and that's exactly what PyPy does:

http://morepypy.blogspot.com.au/2011/08/visualization-of-jitted-code.html


> You could
> perform the optimization before the execution and save the executable
> binary, but the Java gods are reluctant to do that for ideological
> reasons ("compile once, run everywhere").

Who are these Java gods of which you speak?


> Python code, too, is compiled into interpreted bytecode. 

An unjustified assertion. The Python compiler ecosystem is incredibly 
rich, and its compilers use many different strategies:

- compilation to byte-code for some virtual machine 
  (e.g. JVM, .Net, Parrot);

- static compilation to machine-code;

- JIT compilation to machine-code;

- translation to some other language (e.g. Haskell, Javascript), 
  followed by compilation of that code to machine code.
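
The first strategy isn't exotic: it is what CPython itself does. Every 
function is compiled to byte-code for CPython's own virtual machine, 
and the standard dis module will show you the result:

    import dis

    def double(x):
        return x * 2

    # Disassembles to something like:
    #     LOAD_FAST        x
    #     LOAD_CONST       2
    #     BINARY_MULTIPLY
    #     RETURN_VALUE
    dis.dis(double)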

I've already mentioned that PyPy compiles to machine code. So did its 
predecessor, Psyco. Both are JIT compilers, as are Numba, Parakeet, 
Pyston and Unladen Swallow[1].
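
Numba, to pick one, exposes its JIT as a plain decorator. The snippet 
below is only a sketch of typical usage (the exact options vary from 
version to version):

    from numba import jit

    @jit  # compiled to machine code on first call, specialised by type
    def dot(a, b):
        s = 0.0
        for i in range(len(a)):
            s += a[i] * b[i]
        return s

Call it once with a pair of NumPy arrays and subsequent calls run 
native code rather than byte-code.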

Then there's at least one AOT compiler, Nuitka, which produces machine 
code. Its output isn't heavily optimized yet, but the project is still 
young, with only one developer working on it.

http://nuitka.net/posts/static-compilation-that-is-the-point.html


> Again, you
> could compile it into machine code ahead of execution or perform the
> compilation on the fly with JIT techniques. 

You're talking as if this were only theoretical. It is not. The state of 
the art in compiler techniques has advanced a lot since the old days of 
the Pascal P-Machine. Parakeet, for example[2], compiles numeric 
functions to optimized machine code on the fly via decorators, building 
on Static Single Assignment (SSA) form, which some wag described as 
being "covered in Fisher Price's My First Compiler"[3].
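
SSA isn't magic, by the way: it just means every variable is assigned 
exactly once, so the compiler can follow each value from its single 
definition to all of its uses. Worked by hand on a trivial function:

    def f(a, b):
        x = a + b
        x = x * 2
        return x

    # The same body in SSA form (pseudocode): every assignment gets a
    # fresh name, which makes the data flow explicit.
    #     x1 = a0 + b0
    #     x2 = x1 * 2
    #     return x2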


> However, Python is so
> ridiculously dynamic that such compilers have an extremely difficult
> time making effective optimizations.

Anyone who used Psyco ten years ago would laugh at that statement, and 
Psyco isn't even close to cutting edge.
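
The dynamism is real, mind you: even builtins can be rebound at 
run-time. But a JIT treats that as something to guard against, not a 
show-stopper. A minimal illustration, my sketch rather than anything 
Psyco-specific:

    import builtins  # __builtin__ in Python 2

    def f(seq):
        return len(seq)  # which len? that can change under our feet

    print(f([1, 2, 3]))  # 3

    builtins.len = lambda seq: 42  # perfectly legal Python
    print(f([1, 2, 3]))  # 42

A JIT compiles f on the assumption that len is still the builtin, plus 
a cheap guard that checks that assumption; if the guard ever fails, it 
de-optimises and re-specialises rather than running everything at 
interpreter speed.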




[1] Unladen Swallow was not a failure. It's just that expectations were 
so high ("OMG OMG Google is making a Python compiler this will be more 
powerful than the Death Star and faster than time travel!!!1!") that it 
couldn't help but be a disappointment.

[2] http://www.phi-node.com/2013/01/a-transformative-journey-from-python-to.html

[3] http://blog.cdleary.com/2011/06/mapping-the-monkeysphere/


-- 
Steven D'Aprano
http://import-that.dreamwidth.org/


